
The Role of WAN Accelerators in Cloud Backups

As some cloud backup companies fold and close, many more new ones are opening their doors. The cloud backup market has become very crowded, and providers are trying their best to differentiate themselves. Some of the new entrants offer sophisticated hybrid cloud backup services, which makes it difficult to compare the different options. For many buyers, however, the speed of data transfer to the cloud is one of the most important features to consider.

Data transfer speed can be greatly improved by WAN acceleration.

WAN acceleration is a set of techniques used to improve the effectiveness of data flow across a Wide Area Network (WAN). A WAN accelerator, in turn, is a software product or appliance that makes better use of the available bandwidth so that the user's experience on the WAN is more effective. Many believe that the speed of data transfer is limited only by the speed of a company's Internet connection. While the connection does set an upper bound, the technologies and software of the backup vendor also play a role in increasing cloud replication speed with the help of WAN acceleration. So how do WAN accelerators work? How does a vendor improve throughput? What should you explore in order to select the best service provider?

Here are some of the questions that companies should ask to find the best solution:

1 – How Does the Vendor Recover from Network Disruptions?

Because of packet drops and network hiccups, transferring data reliably is not an easy task. It is therefore necessary that the selected vendor offers up-to-date technology that can replicate all data to the cloud. Mature cloud backup services that use traffic optimisation deploy resilient resumption technology, allowing them to recover quickly and maintain operations even when the connection experiences some kind of network disturbance or packet loss.
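
The idea behind resilient resumption can be sketched in a few lines. The snippet below is a minimal illustration, not any vendor's implementation: `send_chunk` is a hypothetical transport callback, and the chunk size, retry limit, and backoff constants are assumptions. The key point is that the upload resumes from the last acknowledged offset after a network error rather than restarting from zero.

```python
import time

def resume_upload(data: bytes, send_chunk, chunk_size: int = 4, max_retries: int = 5) -> int:
    """Upload `data` in chunks, resuming from the last acknowledged
    offset after a network error instead of restarting from scratch."""
    offset = 0
    retries = 0
    while offset < len(data):
        chunk = data[offset:offset + chunk_size]
        try:
            send_chunk(offset, chunk)   # may raise ConnectionError on a network hiccup
            offset += len(chunk)        # advance only once the chunk is acknowledged
            retries = 0
        except ConnectionError:
            retries += 1
            if retries > max_retries:
                raise                   # persistent outage: give up
            time.sleep(0.01 * 2 ** retries)  # exponential backoff, then resume
    return offset
```

In a real service the acknowledged offset would also be persisted, so that resumption survives a process restart, not just a dropped connection.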

2 – Does the Vendor Throttle Bandwidth?

Administrators use traffic-shaping technology to specify the bandwidth that backup traffic may consume at different hours of the day (for example, high bandwidth during non-business hours and low bandwidth during business hours). The best vendors also use a dynamic maximum transfer unit (MTU) to optimise throughput by sizing the amount of data sent on each request according to the company's network connection.
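
A time-of-day bandwidth schedule is commonly paired with a token-bucket rate limiter. The sketch below assumes illustrative caps and business hours (9:00–17:00); neither the numbers nor the class names come from any particular vendor.

```python
import time

# Illustrative caps in bytes/sec -- assumptions, not vendor defaults.
BUSINESS_HOURS_CAP = 128 * 1024       # low cap while the office is busy
OFF_HOURS_CAP = 10 * 1024 * 1024      # high cap overnight

def current_cap(hour: int) -> int:
    """Pick the bandwidth cap for backup traffic from the hour of day."""
    return BUSINESS_HOURS_CAP if 9 <= hour < 17 else OFF_HOURS_CAP

class Throttle:
    """Token bucket: allow at most `cap` bytes per second on average."""
    def __init__(self, cap: int):
        self.cap = cap
        self.tokens = cap
        self.last = time.monotonic()

    def consume(self, nbytes: int) -> None:
        """Block until `nbytes` may be sent under the current cap."""
        while True:
            now = time.monotonic()
            # Refill tokens in proportion to elapsed time, up to the cap.
            self.tokens = min(self.cap, self.tokens + (now - self.last) * self.cap)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            time.sleep((nbytes - self.tokens) / self.cap)
```

A backup agent would call `Throttle(current_cap(datetime.now().hour)).consume(len(chunk))` before each send, re-reading the schedule periodically so the cap changes as the business day starts and ends.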

3 – How Does the Vendor Deduplicate the Data?

Sending every byte across the WAN as-is is not an effective way to accelerate data transfer. To improve transfer rates, vendors apply a deduplication process, which can reduce WAN bandwidth consumption by more than 90%. Most vendors use source-side deduplication to make certain that the appliance maintains a single version of each file. However, not all cloud vendors also dedupe the information over the WAN (to ensure that replicated information does not already exist in the cloud backup repository).

The best-of-breed cloud backup vendors use block-level deduplication over the WAN. This technology checks the cloud repository to determine whether a specific block is already present in the backup. If it is, the block is not copied to the cloud; if it does not exist, the block is transferred.
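
The mechanism described above can be illustrated with a short sketch. This is a simplified model under stated assumptions: fixed-size blocks, SHA-256 content hashes, and a plain dictionary standing in for the cloud repository's index; real products use more elaborate chunking and indexing.

```python
import hashlib

def backup_blocks(data: bytes, repository: dict, block_size: int = 4096) -> int:
    """Split `data` into fixed-size blocks and transfer only blocks whose
    hash is not already in the cloud repository. Returns bytes sent."""
    sent = 0
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in repository:   # block is new: copy it to the cloud
            repository[digest] = block
            sent += len(block)
        # blocks already in the repository are skipped entirely;
        # the backup only records a reference to the existing block
    return sent
```

Notice that a repeated block, whether within one backup or across successive backups, costs no WAN bandwidth at all after its first transfer; this is where the large bandwidth savings come from.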

You will be a well-informed cloud backup consumer if you ask the above questions on network optimisation and WAN acceleration.

How Cloud Services Guarantee High Availability

As businesses become more and more dependent on access to their digital assets, there is a growing intolerance of outages and downtime. Continuous availability, anywhere, anytime, on any kind of device, is the mantra of the age. Businesses struggling to meet this demand turn to cloud services to fulfil these expectations. Cloud service providers, in keeping with their promise, are making an all-out effort to bring the right technologies to the table so that their customers are never offline and their data never becomes inaccessible, whatever the circumstances.

Continuous availability of information requires planning. Once the customer has identified the applications that are mission-critical and must be continuously available, cloud service providers will recommend continuous backup of these applications. The backup process is orchestrated quietly in the background with no disruption to the production systems. Technologies such as bandwidth throttling are used to ensure that the backup process consumes only spare or minimal bandwidth. Data transmitted to the remote server is then continuously replicated onto one or more geographically dispersed servers to create redundant stores of the same information for future recovery.

The cloud service provider is acutely disaster-conscious and has the responsibility of ensuring that disasters that impact the cloud data centre do not get passed on to individual customers using the services in the cloud. As part of the disaster recovery plan for the data centre, the cloud service provider links together the primary and secondary servers (which are geographically dispersed) in failover configurations. Secondary servers take over the moment the primary server develops a glitch or fails in any manner. Customers accessing or updating information are seamlessly shifted from the primary server to the secondary server. The operation is so smooth that customers may not even realise that they have switched servers mid-session. In the process, the cloud service provider creates a window of time in which the primary server can be set right and brought back into operation. High availability is an automatic outcome of this arrangement.
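
From the client's point of view, failover can be as simple as an ordered list of replicas to try. The sketch below is a minimal illustration of that idea, with hypothetical server callables standing in for real replica endpoints; production failover also involves health checks, state replication, and DNS or load-balancer switching that are out of scope here.

```python
def fetch_with_failover(request, servers):
    """Try each replica in order and return the first successful
    response, so a failed primary is transparent to the caller."""
    last_error = None
    for server in servers:
        try:
            return server(request)
        except ConnectionError as exc:
            last_error = exc   # this replica is down: fall through to the next
    raise last_error           # every replica failed: surface the outage
```

The caller sees one logical service; only when every geographically dispersed replica is unreachable does the failure become visible.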

Customers who have geographically dispersed activities can also take advantage of the multi-site data storage functions of cloud services. If the enterprise has branches in the local area where a replication server is housed, it can configure the local replication server to service the requests of the branch. This cuts down on any latency the branch would experience in accessing and working with data stored on the centralised remote primary server. Any updates or operations on the enterprise data can be reverse-replicated from the secondary server to the primary server continuously.
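
Routing a branch to its nearest replica can be reduced to a one-line policy once latency is measured. The helper below is a toy sketch, with replica names and latency figures invented for illustration; a real deployment would measure round-trip times continuously rather than once.

```python
def pick_replica(replicas, measure_latency):
    """Route a branch to the replica with the lowest measured latency,
    e.g. a local replication server instead of the remote primary."""
    return min(replicas, key=measure_latency)
```

Writes made through the chosen local replica are then reverse-replicated to the primary, as described above, so every site converges on the same data.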

It is no wonder that high availability is the guarantee of cloud service providers.
