Tag Archives: Recovery Point Objective (RPO)

Managed Service Providers for Cloud Data Backup & Recovery

In the cloud backup and storage market, Asigra is regarded as a cost-effective, resourceful, compliant, secure, and reliable solution. The Asigra software was first launched in 1986 as a data backup solution. Service providers who want to offer data backup and storage services to organisations consider partnering with Asigra to deliver a reliable and secure service. The software can be configured to scale up or down, makes data management easy, and offers advanced solutions with excellent features.

In fact, Asigra cloud backup and recovery software backs up applications, operating systems, and file systems over the network using recognised industry standards. The Asigra architecture saves time, simplifies support, and requires few resources to run. Managed service providers can encrypt their clients' important data, and Asigra transmits that encrypted data from desktops, laptops, repositories, or remote offices without exposing it to security threats.

Since Asigra launched its software back in 1986, no hacks have been reported, and consequently there has been no data loss or breach. The software allows all kinds of data to be backed up and validated with a digital signature, and data on disk is stored in a self-describing format. When particular data is wiped from the repository, Asigra automatically issues a certificate of destruction. Security procedures are further strengthened by password rotation and password management features: new, strong passwords can be generated and changed on a regular basis to improve security.

The Continuous Data Protection (CDP) feature of Asigra offers unrestricted granularity, allowing customers to recover data to a chosen Recovery Point Objective (RPO). When users save files on their computers, a backup is made to avoid data loss; the software watches the target file for changes and applies the same changes to the backup copy as soon as they are detected. In other words, Asigra is a dependable backup option for email as well as file systems, enhancing recovery of critical data.

Benefits of Asigra CDP:
• CDP ensures that the latest data on the hard drive is backed up;
• Traditional backups could only recover data that had already been backed up on a schedule; with CDP, backup to the cloud happens automatically as soon as data is saved to the drive (as sketched below);
• CDP offers an up-to-date form of data protection in which a record of previous versions is also maintained;
• Asigra can optionally make local backups, encrypt the data, and back it up to a centralised repository, reducing recovery time objectives and the backup window to a minimum.
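
To make the "watch and copy" behaviour described above more concrete, here is a minimal, polling-based sketch of the CDP idea in Python. It is illustrative only and makes no assumptions about how Asigra itself is implemented; real CDP products hook into the file system rather than polling, and the file and directory names are placeholders.

    import os
    import shutil
    import time

    def watch_and_copy(source, backup_dir, interval=1.0):
        """Poll a file's modification time and copy it to a timestamped
        backup whenever a change is detected (illustrative only)."""
        os.makedirs(backup_dir, exist_ok=True)
        last_mtime = None
        while True:
            try:
                mtime = os.stat(source).st_mtime
            except FileNotFoundError:
                time.sleep(interval)
                continue
            if mtime != last_mtime:
                # Change detected: create a new recovery point immediately.
                stamp = time.strftime("%Y%m%d-%H%M%S")
                target = os.path.join(backup_dir, f"{os.path.basename(source)}.{stamp}")
                shutil.copy2(source, target)
                last_mtime = mtime
            time.sleep(interval)

    # Example (hypothetical paths): watch_and_copy("report.docx", "./backups")

Every detected change produces a new recovery point, which is exactly why CDP narrows the data loss window compared with scheduled backups.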
Service providers also understand the significance of Asigra's legal compliance. Its automated, disk-based solution works without human intervention, and when important information needs to be recovered, the backed up data is retrieved from offsite repositories through a private or public cloud.

Asigra’s on-boarding features minimise time to market as well as the operational costs of backup and data recovery. They allow service providers to build a variety of services with different service level agreements. Through the self-service option, end users can create new accounts and integrated vaults without involving the service provider. In short, MSPs using Asigra software are able to meet the demands of their customers.

What is Pre-Process De-Duplication?

Pre-process de-duplication is also known as source de-duplication. It de-duplicates data before it is transmitted to the storage device: all data is channelled through the source de-dupe software or hardware before being sent to the device where it will be stored. The main objective is to avoid sending duplicate data across the network. A connection is established with the designated storage device and the data is evaluated before the de-duplication process begins; synchronisation with the target disk is maintained throughout so that files which already match can be removed at the source. The main advantage is that it saves bandwidth for the user.

To identify changed bytes, the source de-dupe software or hardware performs byte-level scans. To make recovery easy for the user, only the changed bytes are transferred to the destination or target device, and the indexes for the original files are updated with pointers. The whole operation happens quickly without compromising accuracy or efficiency, and source de-dupe is lighter on processing power than post-process de-dupe. Source de-duplication can also categorise data in real time: policy-based device configurations can classify data at a granular level and filter it as it passes through the source de-dupe device. Files can be included or excluded by group, domain, user, owner, age, path, file type, or storage type, or on the basis of RPO or retention periods.
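
The core of source de-duplication is fingerprinting data at the client and consulting an index of what the target already holds before anything is sent. The short Python sketch below illustrates that idea with fixed-size chunks and SHA-256 fingerprints; it is a simplified illustration, not any vendor's actual algorithm (commercial products typically use content-defined chunking and a remote index).

    import hashlib

    def chunk(data, size=4096):
        # Fixed-size chunks for simplicity; real products often use
        # variable-size, content-defined chunking.
        return [data[i:i + size] for i in range(0, len(data), size)]

    def source_dedupe(data, known_hashes):
        """Return a manifest of chunk fingerprints plus only the chunks
        the target does not already hold, so duplicates never cross the network."""
        new_chunks = {}
        manifest = []                      # ordered fingerprints used to rebuild the file
        for piece in chunk(data):
            fp = hashlib.sha256(piece).hexdigest()
            manifest.append(fp)
            if fp not in known_hashes:     # unseen chunk: queue it for transmission
                new_chunks[fp] = piece
                known_hashes.add(fp)
        return manifest, new_chunks

    # Example: a second backup of unchanged data transmits no chunk payload at all.
    index = set()
    m1, send1 = source_dedupe(b"A" * 10000, index)   # first pass sends data
    m2, send2 = source_dedupe(b"A" * 10000, index)   # second pass sends nothing
    print(len(send1), len(send2))                    # -> 2 0

The target stores any new chunks and keeps pointers for the rest, which is the bandwidth saving described above.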

Despite these advantages, source de-dupe has some drawbacks. While it reduces the bandwidth needed to transmit data or files to the destination or target, it imposes a higher processing load on the clients, since the entire de-duplication process runs at the source. CPU consumption on your device can rise by roughly 25% to 50% during source de-duplication, which may not be acceptable. Source-based de-dupe nodes may also need to be deployed at every connected location, which involves more cost and is obviously more expensive than target de-duplication, where all de-duplication is carried out on a single device at a nodal point of the network.

Lastly, if the existing software does not support the de-duplication hardware or algorithms, the software may need to be redesigned. This is not a problem with target de-duplication, where the de-dupe hardware and software are isolated from the organisation’s own hardware and software and no changes are needed at the source.

What are the Most Attractive Features of the Cloud?

What are the most attractive features of the Cloud? Is it its dependability? Its scalability? Its flexibility? Its high availability? Its disaster recovery features? Well, it all depends on what your company’s needs are.

If your company has a low Recovery Point Objective (RPO) and Recovery Time Objective (RTO), then the cloud will be the right choice for you. Cloud vendors can promise uninterrupted customer service and 99.999% ("five nines") uptime for a variety of reasons, including the creation of hot sites and replication sites as part of their basic service. As a cloud customer, you can look forward to continuous and, for the most part, uninterrupted service. Disaster recovery is built into the cloud, so loss of data and time due to service interruptions is not an issue for cloud-based services. This is a very appealing feature for many companies.
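
As a rough illustration of what "five nines" means in practice, the small calculation below (plain arithmetic, not a vendor figure) converts an uptime percentage into the downtime it allows per year.

    # Downtime allowance implied by an uptime percentage (illustrative arithmetic).
    def allowed_downtime_minutes_per_year(uptime_percent):
        minutes_per_year = 365.25 * 24 * 60
        return minutes_per_year * (1 - uptime_percent / 100)

    print(round(allowed_downtime_minutes_per_year(99.999), 2))  # ~5.26 minutes per year
    print(round(allowed_downtime_minutes_per_year(99.9), 1))    # ~526 minutes (about 8.8 hours)

In other words, a five-nines promise leaves room for only a few minutes of outage over an entire year.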

Scalable
Other companies might value the fact that the Cloud is adaptable and can be tailored to meet their growing needs. If a business needs to expand its cloud services, it can do so immediately, with no delay. Because no hardware or software needs to be purchased to upgrade a company’s service, a company can have access to cloud services as soon as it requires them.

Fast
When a company opens a new branch office, there is no lag time in giving the new office access to head office data and files on the web, thanks to the cloud. This gives companies greater flexibility, knowing that their employees can have access to data in the cloud almost as soon as they need it. It makes opening new branch offices much easier. In fact, it might just be a deciding factor when a company decides to expand its operations.

Mobility
Then there are companies that value the mobility the cloud offers their employees. The cloud makes it possible for business travellers to travel light and access their data or documents anytime, wherever they are in the world. They can become paperless travellers, needing only a smartphone, tablet or notebook to access the cloud. Travelling for business has never been so easy, thanks to the cloud.

Inexpensive
And finally, some companies may simply value how cost effective it is to access cloud-based services. No huge capital outlay is required to engage with the cloud, and cloud-related costs can be managed very efficiently. For cost-conscious companies, the cloud is a godsend, allowing them to add cloud-based services as they need them without worrying that it will break the bank.

The cloud, therefore, has many attractive features, and they will only increase over time as cloud technology keeps improving. Which ones matter most depends on what your company values, but rest assured that your company won’t be disappointed by what the cloud has to offer.

Top Ten Reasons to Leave your Cloud Backup Service Provider – Part I

Business relationships are important. You have done your homework: you researched and tested several solutions and settled on one you thought was a great cloud backup vendor. Before you picked this company, you considered several factors, such as technology, experience, financial status, reputation, security, compliance, support, certification, scalability, and trust. But now the vendor is taking it all for granted and providing you with substandard service, resulting in a strained relationship.

Is your relationship with your cloud backup vendor healthy? If your business relationship starts to show signs of stress, chances are it will die at some point. Perhaps it is time for you to gauge your business relationship. If you notice any or all of the following points, you might be in a bad relationship:

1/ Data Backup – the company doesn’t back up all of your data across all operating systems and on mobile devices. Are you forced to use different backup solutions across operating systems (iOS, Windows, etc.) and across mobile devices?

2/ Appliance – Is the vendor appliance-centric? Do you find yourself spending more than you planned on appliances? Is the vendor asking you to acquire additional appliances to keep up with your backed up data? Relying heavily on appliances might not be an ideal solution; cloud-centric solutions offer unlimited scaling as your data grows. Is your data instead tethered to an appliance, forcing you to delete data and/or buy a bigger appliance to gain extra space for your growing data?

3/ SLA – Service Level Agreements are very important. SLAs have a purpose, and that is why a great deal of effort is put into preparing them. Does the vendor deliver according to the signed and approved SLA?

4/ Price – the price the vendor charges varies all the time, is complicated, and you cannot figure out how the pricing model works. Is it per GB of raw or compressed data? Do you get credit for not recovering any data in, say, the past year?

5/ BLM – Does your vendor treat all data the same and back it all up to the same vault? Keep in mind that not all data has the same value: the older data gets, the less important it usually becomes. Mission-critical data should be stored separately with clearly defined RTO and RPO, while less important data should be stored in less expensive vaults (a simple sketch of such age-based tiering follows below). Intelligent software can automatically segment the data into these two tiers.
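
As a simple illustration of the tiering idea in point 5, the sketch below routes data to a hypothetical "primary" or "archive" vault based purely on age. The 90-day threshold and vault names are made-up examples; real backup lifecycle management weighs many more attributes (file type, owner, retention policy, RTO/RPO class, and so on).

    from datetime import datetime, timedelta

    # Hypothetical policy: anything untouched for 90+ days moves to a cheaper vault.
    ARCHIVE_AFTER = timedelta(days=90)

    def pick_vault(last_modified, now=None):
        """Tiny sketch of age-based tiering: old data goes to the archive tier."""
        now = now or datetime.now()
        return "archive-vault" if (now - last_modified) > ARCHIVE_AFTER else "primary-vault"

    print(pick_vault(datetime.now() - timedelta(days=5)))     # primary-vault
    print(pick_vault(datetime.now() - timedelta(days=400)))   # archive-vault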

If you decide to move your services to a new vendor, make sure you don’t end up with the same problems you just switched away from. Insist that the new vendor help with the data migration, or at least provide consultation. Remember that choosing a cloud backup service provider is not a simple task, and the wrong choice could end up putting you out of business.

In Part II, we will discuss five other factors that affect your relationship with your vendor: bandwidth throttling; data centre location; vendor lock-in; DRaaS; and periodic vendor research results.

Data Protection for Virtual Machines—Some Concepts

Virtual machine protection is one of the most challenging tasks in cloud computing. More than backing up the data, it is recovering the data that needs focus. Defining a few metrics upfront will help organisations back up and recover their data efficiently.

The most important metric in virtual machine recovery is the Recovery Time Objective (RTO): how quickly must the virtual machine be recovered and made operational in the event of a disaster or human error? This metric has to be defined by the business managers, not the IT personnel, and the answer will vary with the kind of VM that needs to be recovered. If the VM is mission-critical, the RTO will be very tight and the organisation may need to be able to switch to a hot site or disaster recovery site at the point of failure; it will have to select a continuous backup option with continuous mirroring/replication of data to an alternate site for high availability. If the VM is non-critical, the organisation can afford to go slower on recovery and wait for the primary server to be restored within a specified time frame.

Closely linked with the RTO is the Recovery Point Objective (RPO), which specifies the acceptable data loss window. Can the organisation afford to lose a few minutes, or a few hours, of data input? Data entered at the point of failure may not have been saved and may not be available for recovery. If the organisation cannot afford to lose even a few minutes of data entry, it should select a continuous data backup system; if a few hours of data loss does not make much of a difference, a scheduled backup system will do.
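
A quick way to reason about RPO is to remember that, with scheduled backups, the worst-case data loss equals the backup interval. The small sketch below captures that check; the intervals and RPO values are illustrative examples, not recommendations.

    # Does a scheduled backup interval respect the RPO the business has defined?
    # If not, something closer to continuous protection is needed.
    def meets_rpo(backup_interval_minutes, rpo_minutes):
        # Worst case, a failure happens just before the next scheduled run,
        # so the maximum data loss window equals the backup interval.
        return backup_interval_minutes <= rpo_minutes

    print(meets_rpo(backup_interval_minutes=240, rpo_minutes=60))  # False: 4-hourly backups can lose 4 hours
    print(meets_rpo(backup_interval_minutes=15, rpo_minutes=60))   # True: 15-minute backups fit a 1-hour RPO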

An unfortunate aspect of virtual machine deployment is the potential for virtual machine sprawl. Because virtual machines can be created on the fly without requisitioning additional hardware or software, the resource impact of these rogue virtual machines can be immense. Organisations facing this problem will have to rationalise their VM deployments or provision the time required to back up and recover these machines.

It should be noted at this point that there may be several adjunct systems that need to be recovered along with the VM. The Recovery Time Objective and Recovery Point Objective defined by the business must take into account the time required to recover these systems in addition to the VM itself.

