The term “bandwidth” has been used in Electrical Engineering for years to mean “the difference between the upper and lower frequencies in a continuous set of frequencies, measured in hertz”. In the early 1990s, telcos started to use the term “bandwidth” to describe the volume of data handled, defining it as “the transmission rate of data across a network”. Bandwidth in data transmission is measured in bits per second and represents the capacity of a network connection. An increase in capacity generally means improved performance, although other factors, such as latency, also play a role. We will further discuss how bandwidth utilisation relates to the challenges associated with cloud computing.
Cloud computing providers usually estimate a customer's required bandwidth by considering the quantity of bandwidth available as well as the mean bandwidth utilisation needed by a variety of applications. In addition, cloud computing providers factor in transmission latencies to calculate the time required to upload both the initial backup and all subsequent backups. For that reason, Internet-based cloud backup service providers work hard to enhance overall Internet bandwidth, and they also do everything within their power to reduce the amount of data that flows through their pipes. There are many techniques cloud service providers use to achieve these goals. They can use incremental backup technologies, link load balancing technologies, or binary patching to transmit and apply only the changed portions of files, thereby reducing or balancing the amount of data transmitted. In addition, both de-duplication and file compression techniques may be used to decrease the volume of data that is transmitted over the network.
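The de-duplication idea can be sketched in a few lines. This is a minimal illustration, not any particular provider's implementation: files are split into fixed-size chunks, each chunk is identified by a content hash, and only chunks the server has not already stored are transmitted. The `server_index` set here is a hypothetical stand-in for the provider's chunk index.

```python
import hashlib

def chunk_hashes(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and yield (hash, chunk) pairs."""
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        yield hashlib.sha256(chunk).hexdigest(), chunk

def chunks_to_transmit(data: bytes, already_stored: set) -> list:
    """Return only the chunks whose hashes the server has not seen.

    `already_stored` stands in for the provider's chunk index; in a real
    system this lookup would involve a network round trip or local cache.
    """
    return [chunk for h, chunk in chunk_hashes(data) if h not in already_stored]

# First backup: the server index is empty, so every chunk must be sent.
server_index = set()
first = b"A" * 8192 + b"B" * 4096
to_send = chunks_to_transmit(first, server_index)
server_index.update(h for h, _ in chunk_hashes(first))

# Second backup of nearly identical data: only the changed chunk is sent.
second = b"A" * 8192 + b"C" * 4096
resend = chunks_to_transmit(second, server_index)
print(len(to_send), len(resend))  # 3 chunks initially, 1 changed chunk later
```

The same principle underlies incremental backup: the less redundant data crosses the network, the less bandwidth each subsequent backup consumes.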
Information Technology administrators are often advised to determine, as accurately as possible, the quantity of bandwidth that the organisation will need for both data storage and transfer operations in the cloud, as well as the latency, expressed in milliseconds. They therefore have to consider the number of users and systems that will be pushing data into the available network space for data storage and other functions, at both peak and off-peak hours.
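Such an estimate might be sketched as follows. The figures and the `peak_factor` parameter are illustrative assumptions, not values from the text; the point is simply that average per-user load must be scaled up to cover peak hours.

```python
def required_mbps(users: int, mb_per_user_per_hour: float,
                  peak_factor: float = 2.0) -> float:
    """Rough estimate of the bandwidth (Mbit/s) an organisation needs.

    mb_per_user_per_hour: average megabytes each user pushes to cloud
    storage per hour (an assumed workload figure).
    peak_factor: assumed ratio of peak-hour load to average load.
    """
    avg_mbps = users * mb_per_user_per_hour * 8 / 3600  # MB/h -> Mbit/s
    return avg_mbps * peak_factor

# Example: 200 users each uploading ~90 MB per hour on average,
# with peak demand assumed to be twice the average.
print(round(required_mbps(200, 90), 1))  # 80.0 Mbit/s
```

A real capacity plan would of course be based on measured traffic rather than assumed averages, but the arithmetic takes this general shape.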
An online backup service built with self-service features, augmented by a user-friendly administrative interface, can provide tools that allow customers to select what is backed up at any point in time. A built-in filter enables users to add or remove files and folders from their backup sets. Backup sets may be scheduled for upload to Internet-based servers at different times: massive data transfers can be scheduled during the organisation's off-peak hours, shifting them to times when sufficient bandwidth is available.
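The built-in filter described above might work along these lines. This is a simple sketch using shell-style wildcard patterns; the function name and pattern scheme are illustrative, not a specific product's interface.

```python
from fnmatch import fnmatch

def build_backup_set(paths, include=("*",), exclude=()):
    """Select paths for a backup set using shell-style patterns.

    A path is kept when it matches at least one include pattern and
    no exclude pattern -- a simple stand-in for the kind of built-in
    add/remove filter the text describes.
    """
    return [p for p in paths
            if any(fnmatch(p, pat) for pat in include)
            and not any(fnmatch(p, pat) for pat in exclude)]

files = ["docs/report.docx", "media/video.mp4", "src/app.py", "tmp/cache.bin"]
selected = build_backup_set(files, include=("docs/*", "src/*"),
                            exclude=("*.bin",))
print(selected)  # ['docs/report.docx', 'src/app.py']
```

Pairing a filter like this with off-peak scheduling keeps the largest transfers out of the hours when bandwidth is contended.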
Nevertheless, it is important to know that bandwidth and latency are interconnected: the effective speed of a network connection is a function of both. Latency cannot be decreased drastically, since it is ultimately bounded by physical distance, but bandwidth can usually be increased at any time by provisioning more capacity.
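A simplified transfer-time model makes this interconnection concrete. Assuming total time is one-way latency plus serialisation time (ignoring handshakes and congestion control), latency dominates small transfers while bandwidth dominates large ones:

```python
def transfer_time_ms(payload_kb: float, bandwidth_mbps: float,
                     latency_ms: float) -> float:
    """Total delivery time: one-way latency plus serialisation time.

    A simplified model that ignores protocol handshakes and
    congestion control.
    """
    # payload_kb * 8 = kilobits; kilobits / Mbit/s = milliseconds
    serialisation_ms = payload_kb * 8 / bandwidth_mbps
    return latency_ms + serialisation_ms

# Small request: latency dominates -- doubling bandwidth barely helps.
print(round(transfer_time_ms(10, 100, 50), 2))       # 50.8 ms
print(round(transfer_time_ms(10, 200, 50), 2))       # 50.4 ms
# Large transfer: bandwidth dominates -- doubling it nearly halves the time.
print(round(transfer_time_ms(100_000, 100, 50), 2))  # 8050.0 ms
print(round(transfer_time_ms(100_000, 200, 50), 2))  # 4050.0 ms
```

This is why buying more bandwidth speeds up bulk backups considerably but does little for small, latency-bound interactions.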