All posts by Kris Price

How Do You Avoid Bad Backups?

Every backup process is fraught with risk, and the cloud is no exception. However, unlike tape and other traditional backup methods, bad backups and backup failures in the cloud can be tracked and corrected almost instantly.

Backup failures in the cloud can occur at the source. If the backup software manager is not resilient, a power failure can disrupt the backup schedule and cause the backup to fail. Most cloud vendors are conscious of this problem. Their software maintains log files that immediately record the failure, and backup reports can be set to pop up automatically when power is restored, or be called up manually by the system administrator monitoring the status of the backup.

Where the client-based cloud software provides for continuous backup, the failure is registered in the log and the backup resumes automatically once power is restored. In either case, proactive cloud vendors handle bad backups and backup failures by constantly monitoring the status of customer backups from their end of the chain and notifying the customer of a failure by email or telephone, in addition to any alerts triggered at the client end.
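The resume-after-failure behaviour described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual implementation: completed files are recorded in a manifest, so a run interrupted by a power failure picks up where the last one stopped, and every failure is written to the log.

```python
import logging
from pathlib import Path

def run_backup(files, manifest_path, upload):
    """Back up `files`, skipping anything already listed in the manifest.

    `upload` is a caller-supplied function that sends one file to the
    cloud; it stands in for whatever transfer mechanism the client uses.
    """
    manifest = Path(manifest_path)
    done = set(manifest.read_text().splitlines()) if manifest.exists() else set()
    for f in files:
        if f in done:
            continue  # already backed up in an earlier (interrupted) run
        try:
            upload(f)
        except Exception as exc:
            logging.error("backup of %s failed: %s", f, exc)
            raise  # stop here; the manifest lets the next run resume
        with manifest.open("a") as m:
            m.write(f + "\n")  # record success immediately, before moving on
```

Because each success is recorded before the next file starts, an interruption at any point loses at most the file in flight.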

Poor compression, de-duplication and encryption algorithms can also generate bad backups. Data being encrypted, compressed or de-duplicated may be corrupted by a poorly constructed algorithm, making it unrecoverable. A similar problem can arise at the destination if the data is encrypted, compressed or de-duplicated at the cloud vendor's server with poorly structured algorithms. Mid-process power failures account for other types of data corruption that can occur on either the client or the server side of the backup process.
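One common safeguard against the corruption described above is a checksum round-trip: verify that the processed data still restores to the original before it is sent. The sketch below assumes `zlib` as a stand-in for whatever compression the backup software actually applies.

```python
import hashlib
import zlib

def safe_compress(data: bytes) -> bytes:
    """Compress `data`, verifying the result decompresses to the original.

    Raises ValueError if the round-trip does not reproduce the input,
    so a corrupting compression step is caught before upload.
    """
    digest = hashlib.sha256(data).hexdigest()
    compressed = zlib.compress(data)
    restored = zlib.decompress(compressed)
    if hashlib.sha256(restored).hexdigest() != digest:
        raise ValueError("compression round-trip corrupted the data")
    return compressed
```

The same pattern applies to encryption and de-duplication stages: hash before, process, restore, hash again, and refuse to ship anything that fails the comparison.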

Unauthorised listeners on the network, employees with malicious intent, or even ignorant personnel can cause a good backup to go bad. While most cloud vendors work to prevent the hijacking of data during transmission and make an all-out effort to safeguard customer data from attackers, no system is absolutely hacker-proof. Wise users will insist on maintaining multiple copies of their backup as insurance against the possible corruption of any one copy.

Cloud vendors perform data replication and mirroring at the server end of the chain to ensure high availability and security of customer data. Many encourage their customers to maintain a local copy of their data in addition to the offsite copies they create, and many offer local backup devices as part of their package. The client-based software creates a local copy of the data on the onsite device even as a cloud-based copy is being created on the remote server.

At Backup Technology, we understand the security needs of our customers. Our software logs every activity performed, and backup failures are instantly reported. The continuous backup option enables a backup to resume automatically after power is restored, while a failed schedule can be completed by manually invoking the backup process. Our encryption, decryption and compression algorithms are well tested and proven. We replicate and mirror customer information across multiple, geographically dispersed servers to ensure high availability and disaster recovery.

Conceptualising Cloud Based MIS

Conceptual design of an MIS is the output of an interactive, highly focused discussion between business managers and IT professionals. It is a high-level definition of the MIS objectives, guiding policies and constraints, with reasoned consideration of viable inputs, storage, outputs, communication protocols and business processes, leading to the generation of alternative MIS designs and the selection of the best-fit design for the organisation.

Input for an MIS may be received from external or internal sources. For instance, a steel manufacturing company may receive inputs on the market price of iron ore, cost of transportation and so on from external sources, while information about smelting, cleaning and processing times comes from internal sources. It may also hold intellectual property, such as a formula for extracting iron from ore cost-effectively. The business managers and IT professionals will have to decide how to integrate the information received from these different sources and how to communicate the re-ordered information to employees at different levels of the organisation.

If the organisation has a number of branches scattered across geographical regions, the MIS design will have to give serious consideration to whether the data should be centralised or distributed. Both kinds of databases have advantages and disadvantages that affect data retrieval. Access time, speed of access, latency and similar factors will determine how the organisation wants to make its data available to its employees. The volume of information available or generated by the system will affect capacity planning and the kind of scalability the organisation wishes to deploy. The sequential or relational nature of the information will further determine how it is organised and made available.

Organisations may process information in batches or record by record, and combination approaches are not uncommon. The use of sophisticated modelling techniques in information processing may require complex applications such as CAD/CAM, which may have to be re-configured or re-engineered for cloud deployment. Simpler applications, such as word processing, may be deployed with public or shared licensing systems.

Ultimately, the test of the system is in the output. The system design must ensure that the system will be capable of delivering the right kind of output to the right level of employee in time at the right frequency. The output may be visual or verbal. It may be direct or routed through the senior management.

It is important to get the conceptual level of MIS design for the cloud right. It is the basis on which the detailing for the cloud is built.

Cloud Based Data Vs. Unmanaged Local Data – Which One is Insecure?

Did you know that some organisations collect data whenever and wherever they can? In most cases, though, the collected data simply sits idle. You might wonder why data they put so much energy into collecting sits unused; the reason is a lack of data management. Unmanaged data can prove costly.

Unmanaged data causes a lot of headaches. It is a waste of organisational resources, and decision makers who rely on it can make mistakes with serious consequences. Collected data must therefore be well managed, neatly organised and used accurately.

The cloud offers reliable technologies for easy data management. Data that is well organised in the cloud can benefit organisations in three ways:

Compliance: properly managed data is easier to keep legally compliant. E-discovery for legal purposes and data protection become much simpler when data is well managed, yielding savings in both the short and the long run.

Efficiency: if data is organised, its management becomes easier and more automatic. As a result, fewer resources need be assigned to data management.

Profitability: organised data brings useful information to the forefront, such as information about products and leads. Companies tracking product opportunities and leads can act promptly and capture the customer at the moment when the need is greatest.

Cloud computing contributes greatly to data management. First, it removes data silos: cloud backup and storage direct data to the cloud, where it can be consolidated and organised in a single database.

With unstructured local data, the organisation has no idea what is being leaked, who is leaking it, or who is harvesting data from its systems. Data sent to cloud repositories, by contrast, is encrypted, and the encryption key is held only by the organisation's data administrator. Anyone attempting to access the cloud database without appropriate authorisation is tracked, logged and can be located instantly. Even if an access attempt succeeds, the encryption renders the data unreadable.
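The principle at work here is that encryption keeps the key with the organisation, so data read from the cloud repository is useless to an intruder. The toy cipher below (a SHA-256 keystream XORed with the plaintext) is strictly for illustration of that principle; a real client would use a vetted cipher such as AES-GCM, never a home-made construction like this.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; without the key, the
    ciphertext reveals nothing useful."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

toy_decrypt = toy_encrypt  # XOR is its own inverse
```

The point of the sketch: the administrator's key never travels with the data, so intercepting the ciphertext in transit or in the repository yields nothing.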

As this discussion shows, keeping unstructured data in your organisation is actually more costly, and potentially more disastrous, than keeping it managed in the cloud.

Understanding Cloud Risks and How to Mitigate Them

Doomsayers will say what they must, but you need to prepare for the worst and plan to deliver the best of the cloud to your organisation. It is true that there are risks attached to cloud computing, but these risks are controllable if you know what they are and how they need to be handled. Let us examine some of the risks and understand how they can be mitigated.

The use of the Internet as the network makes cloud computing vulnerable, and several issues are often highlighted in this context. First, the APIs used by cloud vendors are not checked for vulnerabilities by cloud users, which creates security problems. Second, most cloud solutions are multi-tenant, and vulnerabilities may be introduced at the software level, making user content pregnable. Third, cloud computing may create compliance risks, whatever the vendor may claim. Finally, the proliferation of cloud subscriptions can complicate budgeting and IT spending and make it difficult to recognise spending patterns, given that cloud billing is usage driven.

Take a close look at the statements. You will realise that each of these problems can be effectively countered with the right tools. It is obvious that there are a number of measures that can be taken to ensure that any transactions conducted over the Internet are inviolable and cloud subscriptions do not become uncontrollable. All it requires is a little bit of common sense, tons of enforceable discipline, and plenty of planning. This is the most boring part of cloud migration, but an effort that will yield huge dividends to any organisation in the long run.

Companies embarking on their cloud venture must lay down clear ground rules. A cloud purchase policy is a good starting point. The policy must stipulate that all cloud subscriptions be authorised by a central authority, and that branches or mobile users cannot subscribe to cloud vendors of their choice. If one centralised authority does all the cloud buying, it becomes possible to have a complete and holistic picture of cloud spending.

Further, the centralisation of purchases will facilitate any other risk mitigation activity that the organisation may wish to undertake. For instance, the IT Administrator can examine in some detail the software deployments offered by the cloud vendor. Any multi-tenancy environments can be vetted before they can be accepted. The compliance offerings can be tested before data is transferred to the cloud. The encryption protocols offered can be evaluated for impregnability of the information being transmitted over the Internet and stored in the cloud repository. It will be a good idea to check on the disaster recovery functionalities before the final commitment is made.
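The vetting steps above can be folded into the purchase policy as a simple gate: no subscription is approved until every check has been signed off. The checklist names below are assumptions for illustration, not a standard; each organisation would define its own.

```python
# Hypothetical central purchase-policy gate. A subscription request is
# approved only when every vetting step has been completed and recorded.
REQUIRED_CHECKS = {
    "api_vetted",            # vendor APIs reviewed for vulnerabilities
    "multi_tenancy_vetted",  # tenant isolation examined
    "compliance_tested",     # compliance offerings verified against needs
    "encryption_evaluated",  # transport and at-rest encryption assessed
    "dr_checked",            # disaster recovery functionality confirmed
}

def approve_subscription(completed_checks: set) -> bool:
    """Central authority approves only when no vetting step is missing."""
    return REQUIRED_CHECKS <= completed_checks
```

A request missing even one sign-off is rejected, which is exactly the discipline centralised purchasing is meant to enforce.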

The bottom line is: you are responsible for the security of your cloud. Forget what the doomsayers have to say!

The Logic of Versioning

Everyone who has worked with someone else will understand and appreciate the need for versioning. One member of a team may create a document while others critically evaluate its contents. Without versioning, each modification overwrites the original contents, and if the changes have been saved, the original is lost forever. Should the team later want to go back to the first version, it will simply not be available. Versioning solves this by saving the first document as the original version and every modified document as a new version of it. If the original has to be revisited or restored, users simply call up the document saved as the original.

Versioning technology is often packaged as part of a Document Management System (DMS). Each vendor may apply a different algorithm and logic for identifying and numbering versions, but fundamentally versions are registered sequentially. For instance, a versioning algorithm may number a document as A v.1, with subsequent versions numbered A v.1.1, A v.1.2, and so on, until the ceiling on the number of versions that can be stored is reached. The date of creation of the version (generally the system date) may also be used as the numbering mechanism.
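The sequential numbering scheme just described can be sketched in a few lines. This is one possible logic, not any particular vendor's: the first save becomes "A v.1" and each later save appends an increasing minor number.

```python
def next_version(existing: list) -> str:
    """Return the next sequential version label for document 'A'.

    `existing` is the list of labels already stored, oldest first.
    """
    if not existing:
        return "A v.1"          # first save is the original version
    return f"A v.1.{len(existing)}"  # A v.1.1, A v.1.2, ...
```

A date-based scheme would simply substitute a timestamp for the counter in the label.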

Most backup and recovery systems limit the number of versions of a document that can be saved on the DMS. Some vendors allow users to save only a few tens of versions, while others may permit storage of a hundred versions. As new versions are added to the database, older versions may be archived, deleted or removed automatically from the storage repository. Users who wish to store more than the stipulated number of versions may have to rename the document and store versions of the new original.
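The retention ceiling behaves like a fixed-length queue: once the stipulated number of versions is reached, saving a new version drops the oldest. A minimal sketch of that behaviour, using Python's bounded `deque`:

```python
from collections import deque

class VersionStore:
    """Keep at most `limit` versions; the oldest is discarded first."""

    def __init__(self, limit: int):
        self.versions = deque(maxlen=limit)  # oldest falls off automatically

    def save(self, content: str):
        self.versions.append(content)
```

Real systems may archive the evicted version rather than discard it, but the ceiling logic is the same.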

Versioning technology may be linked with de-duplication technology. Since de-duplication looks for exact duplicates to eliminate, versions that are identical copies of an existing file are stored only once. Versioning may also be linked with incremental or differential backup, so that only the modified portions of the file are stored in the new version, with references to the unchanged portions used to rebuild the file during recovery.
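The "store only the changed portions, with references to the rest" idea can be sketched as block-level storage: each version is a list of block hashes, blocks shared with earlier versions are stored once, and recovery rebuilds the file from the references. The 4-byte block size is artificial, chosen only to keep the example small.

```python
import hashlib

BLOCK = 4  # artificially small block size for illustration

def store_version(content: bytes, pool: dict) -> list:
    """Split content into blocks, store only new blocks in `pool`,
    and return the list of hash references that identifies this version."""
    refs = []
    for i in range(0, len(content), BLOCK):
        block = content[i:i + BLOCK]
        h = hashlib.sha256(block).hexdigest()
        pool.setdefault(h, block)  # unchanged blocks are not stored again
        refs.append(h)
    return refs

def restore(refs: list, pool: dict) -> bytes:
    """Rebuild a version from its references during recovery."""
    return b"".join(pool[h] for h in refs)
```

If only the last block of a file changes between versions, only that one new block is added to the pool; both versions remain fully recoverable.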

With the advent of cloud backup and cloud computing, versioning technologies have become immensely sophisticated, and document management has been fine-tuned to accommodate the varied needs of its patrons.

Advantages of a Hosted Email Service

An increasingly mobile world requires agile computing solutions. In this context, swapping the on premise Microsoft Exchange servers for hosted email services is not a bad idea. It saves money, improves productivity and reduces maintenance. The economics of it all is too attractive to ignore.

Having said that, it is important to point out that each organisation is unique, with unique needs, and cost savings from the swap may range from as little as 30% to as much as 70%. The actual or ballpark figure will have to be arrived at by each organisation individually at the start of the exercise.

Cloud productivity suites present an option. For instance, Google Apps costs $5 per user per month; if Google Vault is added to the mix, the cost rises to $10 per user per month. HyperOffice starts at $15 per user per month. Microsoft Office 365 has plans ranging from $8 to $22 per user per month. Each of these cloud-based applications includes a full suite of productivity tools and some minimum storage space per user.
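To make the list prices quoted above directly comparable, a quick calculation of annual cost for a team is useful. The figures are the per-user monthly prices from the paragraph; the plan names are shorthand labels for this sketch.

```python
# Per-user monthly list prices quoted above (USD).
PLANS = {
    "Google Apps": 5,
    "Google Apps + Vault": 10,
    "HyperOffice": 15,
}

def annual_cost(plan: str, users: int) -> int:
    """Annual cost for a team: monthly price x users x 12 months."""
    return PLANS[plan] * users * 12
```

For a 10-person team, that puts Google Apps at $600 a year against HyperOffice at $1,800, which is the kind of spread each organisation should work out for its own headcount before choosing.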

The advantage of moving to a hosted email service is that a number of collaboration and communication tools come with the package. File storage, social networking and document editing tools are often thrown in to enhance their appeal. Furthermore, all or any of these applications can be used by the end users from anywhere, anytime and from any kind of device.

As a result, organisations of all sizes can institute and implement a Bring Your Own Device (BYOD) policy without worrying about mobile device management or the security of organisational data. The hosted system is not tied to any particular hardware or software, releasing the organisation from the shackles that held it to a particular geographical location.

However, hosted email services demand their pound of flesh. A lot of planning and implementation efforts are needed to successfully commission the service in any organisation. The implementing entity must have a clear understanding of what the business demands and what its processes involve. Without that understanding, the implementation efforts will fail from the word “go”.

For those who are wary of venturing into the unknown, Microsoft Exchange can be abstracted to the cloud using Amazon Web Services or Rackspace.

Our Customers

  • ATOS
  • Age UK
  • Alliance Pharma
  • Liverpool Football Club
  • CSC
  • Centrica
  • Citizens Advice
  • City of London
  • Fujitsu
  • Government Offices
  • HCL
  • LK Bennett
  • Lambretta Clothing
  • Leicester City
  • Lloyds Register
  • Logica
  • Meadowvale
  • National Farmers Union
  • Network Rail
  • PKR
