Tag Archives: Compression

Establishing Successful Cloud Computing Services

One method of ensuring that the parties to a contract share the same expectations, and the same understanding of how they will be fulfilled, is to draw up service level agreements (SLAs). These agreements clearly specify what the vendor is willing to deliver and what the customer can expect to receive under a cloud services contract. SLAs are an important management tool; they are often formally negotiated and include specific metrics to quantify the delivery of the agreed services.
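
As a concrete illustration of such a metric, availability is usually expressed as a percentage of a measurement period. The figures below (a 30-day month, 20 minutes of downtime, a 99.9% target) are hypothetical, a sketch of the arithmetic rather than any particular contract's terms:

```python
# Illustrative sketch: quantifying an SLA availability metric.
# The downtime figure and the 99.9% target are hypothetical example values.

def availability_pct(total_minutes: int, downtime_minutes: float) -> float:
    """Availability as a percentage of the measurement period."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

minutes_in_month = 30 * 24 * 60   # 43,200 minutes in a 30-day month
measured = availability_pct(minutes_in_month, downtime_minutes=20)

SLA_TARGET = 99.9
print(f"Measured availability: {measured:.3f}%")
print("SLA met" if measured >= SLA_TARGET else "SLA breached")
```

A formally negotiated SLA would also define how downtime is measured and what remedies apply when the target is missed; the point here is only that the metric must be computable from agreed inputs.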

Before discussing the “how to” of establishing a successful business relationship in the cloud, let us quickly review the “bare minimum offering” in the cloud:

1. Readily available computing resources are exposed as a service;
2. The economic model is generally a pay-as-you-go service;
3. May or may not process data into meaningful contexts;
4. Limited guarantees on scalability, reliability and high availability;
5. Security systems are designed to be reasonably hacker proof;
6. Supports environmental goals by reducing carbon footprints;
7. Provides monitoring tools and metrics for evaluating services.

A quick review of the offerings of short-listed cloud vendors will establish the decision points for the relationship and for drafting the SLA. The enterprise must have clarity on:

1. Whether the kind of service being offered by the vendor is the kind of service the enterprise needs;
2. Whether the “unit” measure of service is clearly defined and can be monetised;
3. Whether the enterprise wants the service provider to process the data into meaningful contexts using compression or de-duplication technologies, or wants the data to be stored “as is”;
4. Whether scalability, high availability and reliability can truly be obtained via the service. The enterprise must examine in some detail the technical claims made by the service provider and their feasibility. Quick market research on the vendor’s reputation will also help in decision making;
5. Whether security guarantees are backed by industry best practices and third-party certifications of cryptographic algorithms and user acceptance;
6. Whether green computing options are strictly enforced by the vendor; and
7. Whether the service monitoring tools provided will truly reflect the level of service being delivered by the vendor.

We, at Backup Technology, believe in working with our customers in a trustful relationship. The Service Level Agreements (SLAs) we design are guaranteed to satisfy the most stringent monitoring requirements and reflect the kind of relationship we seek to establish with our customers.

Efficient, Stable De-duplicating Processes

Storage needs are ballooning. If nothing is done, data volumes will soon overwhelm organisations like a tsunami that never recedes. The choices are few. Organisations must:

1. Get more storage space;
2. Archive / take offline data that is no longer relevant; or
3. Compress the data stored.

While falling disk prices and innovative storage systems have made it possible to hire more disk space or archive data effectively, it is data compression that has received a lot of attention in recent years. Compression not only saves disk space, it also saves bandwidth required for transmission of data over the network. Data compression includes data de-duplication and is relevant both for data storage and data archival.
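
The space saving from compression is easy to see on repetitive data. A minimal sketch using Python's standard zlib module (the sample data and compression level are purely illustrative):

```python
import zlib

# Highly repetitive sample data -- backup streams often contain such redundancy.
data = b"backup data block " * 1024

compressed = zlib.compress(data, level=9)   # level 9: maximum compression
ratio = len(data) / len(compressed)

print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")
print(f"compression ratio: {ratio:.1f}:1")

# Lossless: decompression restores the original bytes exactly.
assert zlib.decompress(compressed) == data
```

Real-world ratios depend heavily on how redundant the data actually is; text and database dumps compress well, while already-compressed media barely shrinks.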

Disk based “de-duplication systems” compress data by removing duplicates of data across the data storage system. Some implementations compress data at a ratio of 20:1 (total data size / physical space used) or even higher. This may be done by reducing the footprint of the versioned data during incremental or differential backup.

Vendors use a variety of algorithms to de-duplicate data. Chunking algorithms break the data into chunks for de-duplication purposes. Chunks can be delimited by physical-layer constraints, by sliding-block algorithms, or by single-instance (whole-file) storage algorithms. Client backup de-duplication systems use hash calculations to evaluate similarity between files, so that duplicates can be removed and replaced with references. Primary and secondary storage de-duplication designs also vary. While primary storage de-duplication is directed towards performance optimisation, secondary storage is more tolerant of performance degradation, so its de-duplication algorithms are constructed with more leeway.
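
As an illustration of the chunk-and-hash approach, the sketch below splits data into fixed-size chunks and stores each unique chunk once, keyed by its SHA-256 digest. This is an assumed, simplified design for illustration, not any specific vendor's algorithm:

```python
# Fixed-size chunking with hash-based de-duplication (simplified sketch).
import hashlib

CHUNK_SIZE = 4096

def dedupe(data: bytes):
    store = {}    # digest -> chunk bytes, each unique chunk stored once
    recipe = []   # ordered list of digests needed to rebuild the original
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # keep only the first copy
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    return b"".join(store[d] for d in recipe)

# Ten identical "A" chunks and two identical "B" chunks:
data = b"A" * CHUNK_SIZE * 10 + b"B" * CHUNK_SIZE * 2
store, recipe = dedupe(data)
print(f"chunks referenced: {len(recipe)}, chunks stored: {len(store)}")
assert restore(store, recipe) == data
```

Here twelve chunk references resolve to only two stored chunks, which is the footprint reduction the ratio figures above describe; production systems add content-defined chunk boundaries and persistent indexes on top of this idea.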

Until recently, data de-duplication was only associated with secondary storage. The increasing importance of the cloud as a primary storage has created a lot of interest in de-duplication technologies for primary storage.

We, at Backup Technology, offer an agentless cloud backup service powered by Asigra. Our software de-duplicates data at source and transmits only the compressed, bandwidth-efficient incremental or differential blocks of the backup set over the network. Global-level de-duplication at source is also available for the hybrid versions of the cloud for local backups. The comprehensive solution is non-invasive and ideally suited for small and medium businesses as well as enterprises. Why don’t you try out our software before you decide to commit yourself?

Functionality, Quality, Price—The Evaluation Parameters for the Cloud

IT budgets do not scale in proportion to IT needs. Data growth outstrips infrastructure and headcount growth. The CIO is forced to compromise.

What if the enterprise could become instantly IT enabled with very little investment in infrastructure, software or HR?

Utility computing in the cloud serves the enterprise with what they need, when they need it, through any channel and any kind of device. The technology integrates and automates the value chain, adapts easily and innovates constantly. Risk and environmental responsibilities are well orchestrated and everything is streamlined to deliver ‘best fit’ services. Functionality, quality and price are definitely attractive.

Cloud computing enhances the efficiency and functionality of the enterprise. Cloud storage systems are developed to support “on demand” utility computing models — SaaS, PaaS and IaaS — with the intent to deliver IT as a service over the Internet. Users can scale up or scale down on infrastructure or space instantly and pay only for what they use. Mobile and remote computing technologies are made available for disparate parts of the business, and mobile workers can synchronise their activities with those of the parent business from wherever they are. Employees can collaborate with each other or access business applications from the central server. User and usage management policies can be implemented by exploiting the functionality built into the cloud application.

Quality of service delivery is the unique selling point (USP) of cloud vendors. Quality of service (QoS) distinguishes them from the competition and builds trust in business relationships with their customers. Cloud vendors are conscious that their services are evaluated on the basis of qualitative factors, such as the design and delivery of security systems, compression and de-duplication of data, or speed of backup and recovery. The way the services are packaged together also makes a difference.

Economies of scale, deriving from multi-tenancy computing models, make the cloud attractive to cash-strapped enterprises. The pay-per-use model, typical of the utility services sector, enables small and medium enterprises with modest budgets to garner and use resources that were earlier available only to their larger brethren. Additionally, CAPEX vanishes and is replaced by OPEX. This makes the cloud wholly attractive to managements who do not want to invest scarce resources in IT infrastructure to the detriment of other business activities.

Support services provided by the cloud shrink the IT expertise requirements within the enterprise. Hardware and software maintenance in the cloud is the responsibility of the cloud vendor. The vendor is also committed to ensuring high availability of customer information and 99.9% uptime. Responsibility for mirroring, replication, de-duplication, compression and secure storage of information is transferred to the cloud vendor. A single IT administrator can manage the database and maintain offsite copies of the data for additional data availability.
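
It is worth translating a 99.9% uptime commitment into concrete terms. The back-of-envelope arithmetic below assumes a 30-day month for illustration:

```python
# Translate an uptime percentage into the downtime budget it permits.
def allowed_downtime_minutes(uptime_pct: float, period_minutes: int) -> float:
    """Downtime allowance implied by an uptime commitment over a period."""
    return period_minutes * (100.0 - uptime_pct) / 100.0

month = 30 * 24 * 60   # 43,200 minutes in a 30-day month
print(f"99.9% uptime permits {allowed_downtime_minutes(99.9, month):.1f} "
      "minutes of downtime per month")
```

Three nines thus allows roughly three-quarters of an hour of outage per month, which is why the SLA's measurement rules and remedies matter as much as the headline percentage.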

We, at Backup Technology, offer best-of-breed public, private and hybrid cloud services to our customers. We anticipate customers’ every need and work towards providing them with the functionality they require without compromising on quality. Our pay-per-use pricing model is economical and wholly affordable. For more information, please do visit our website: www.Backup-Technology.com.

Keeping Costs Under Control

The traditional ontology of costing, and the micro-theories built on it, may need to be redefined in the virtual world, where value-adding activities are less visible except in areas such as strategic alliances, information sharing or online payment practices. Costing theories may have to be deployed generically to achieve cost management, irrespective of whether the enterprise belongs to the manufacturing or the service sector. The criteria for evaluation may include generality, efficiency, perspicuity, transformability, extensibility, granularity, scalability and competence.

So, how does one keep costs under control? The process must logically begin by recognising the characteristics of the cloud and identifying suitable costing methods for the delivery and use of services.

• The cloud consolidates the distributed enterprise and shifts all activity online, to the remote data centre in the cloud. This largely precludes the possibility of pinpointing specific activities to specific cost centres.
• CAPEX vanishes with the cloud and is replaced by OPEX. Direct expenses suddenly assume the aspect of indirect expenses.
• Logistics costs and information management costs form a large chunk of the costs for the virtual enterprise.

Consequently, cloud vendors adopt use-based, granular costing methods for estimating the costs of maintaining a cloud and servicing the customer. There is a defined base cost and a top-up cost for the service. While base costs are fixed, top-up costs vary with the level of usage of the services. The secret of cost control, in this context, is to monitor the base-plus-top-up total, or to evaluate the storage space or application usage of the different participating departments and define usage limits for budgeting purposes.
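
The base-plus-top-up model described above can be sketched in a few lines. The base fee, included allowance and per-GB rate here are hypothetical illustration values, not any vendor's actual tariff:

```python
# Sketch of base-plus-top-up pricing with hypothetical rates.
def monthly_bill(used_gb: float, base_fee: float = 50.0,
                 included_gb: float = 100.0, rate_per_gb: float = 0.20) -> float:
    """Fixed base fee covers included_gb; usage beyond that is billed per GB."""
    overage = max(0.0, used_gb - included_gb)
    return base_fee + overage * rate_per_gb

print(monthly_bill(80))    # within the included allowance: base fee only
print(monthly_bill(350))   # 250 GB of top-up usage on top of the base fee
```

A budgeting exercise then reduces to running each department's projected usage through the tariff and setting alerts where the top-up component starts to dominate.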

However, the process is not as simple as it seems. The implication is that enterprises will have to become more sophisticated about shaping the use of resources within the enterprise. They will have to recognise utilisation risks, and management will have to play a significant role in ensuring both sufficiency of usage and its financial viability. Cost-saving opportunities will involve using cloud-based applications only where their use makes business sense, and scaling up operations in the cloud only when there is a peak transaction load. Since pricing in the cloud is directly proportional to utilisation, IT finance teams must be made conscious that IT operations have to become more business-like in their dealings if costs are to be kept under control. IT administrators must keep in lock step with business growth and success, and optimise the use of cloud-based IT resources to the advantage of the business.

We, at Backup Technology, are always ready to work with our customers in evaluating the economics of using the cloud. Our cloud services are designed to optimise resource use for our customers. Our customers remain informed on usage data and have the option to scale up or scale down usage on-demand. Alerts are generated when usage boundaries are reached. Compression, de-duplication and archiving technologies are used to help our customers achieve cost savings in the cloud.

How Do You Avoid Bad Backups?

Every backup process is fraught with risks. The cloud is no exception. However, unlike in tape and other kinds of backup, bad backups or backup failures in the cloud can be instantly tracked and corrected.

Backup failures in the cloud can occur at source. If the backup software manager is not resilient, power failures can disrupt a backup schedule and cause a backup failure. Most cloud vendors are conscious of this problem. The software maintains log files that immediately record a backup failure; backup reports can be set to pop up automatically on restoration of power, or can be called up manually by the system administrator monitoring the status of the backup.

Where the client based cloud software provisions for continuous backup, the backup failure is registered in the log and the backup will resume automatically on the restoration of power. In either case, proactive cloud vendors handle bad backups and backup failures by constantly monitoring the status of customer backups from their end of the chain and notifying the customer of a backup failure via email or telephone in addition to any alerting mechanisms that may be triggered at the client end.
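
The resume-from-log behaviour described above can be sketched as follows. Each completed chunk is journalled to a small state file, so an interrupted backup restarts from the last recorded position rather than from scratch. All names here are illustrative, not a real vendor's API:

```python
# Simplified sketch of journal-and-resume backup logic (illustrative only).
import json
import os

STATE_FILE = "backup_state.json"

def load_state() -> int:
    """Return the index of the first chunk not yet confirmed as uploaded."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)["next_chunk"]
    return 0

def save_state(next_chunk: int) -> None:
    with open(STATE_FILE, "w") as f:
        json.dump({"next_chunk": next_chunk}, f)

def run_backup(chunks, upload) -> None:
    """Upload chunks in order, journalling progress after each one."""
    start = load_state()
    for i in range(start, len(chunks)):
        upload(chunks[i])     # may raise on a power or network failure
        save_state(i + 1)     # journal progress so a rerun can resume here
    save_state(0)             # backup complete; reset for the next run
```

If the process dies mid-run, the journal still holds the last confirmed position, so the next invocation of run_backup continues from there instead of re-sending everything.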

Poor compression, de-duplication and encryption algorithms can generate bad backups. Data being encrypted, compressed or de-duplicated may be corrupted by poorly constructed algorithms, making it unrecoverable. A similar problem can arise at the destination if the data is encrypted, compressed or de-duplicated at the cloud vendor’s server with poorly structured algorithms. Mid-process power failures may be blamed for other types of data corruption that can occur at the client or server side of the backup process.
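
One common safeguard against such corruption is to record a digest of the data before it is transformed and verify it after the round trip. The sketch below uses Python's hashlib and zlib purely by way of illustration:

```python
# Integrity-checked compression round trip (illustrative sketch).
import hashlib
import zlib

def backup_block(data: bytes):
    """Compress a block and record a digest of the original for later checks."""
    return zlib.compress(data), hashlib.sha256(data).hexdigest()

def restore_block(compressed: bytes, expected_digest: str) -> bytes:
    """Decompress, and refuse to return data that fails the integrity check."""
    data = zlib.decompress(compressed)
    if hashlib.sha256(data).hexdigest() != expected_digest:
        raise ValueError("backup block failed integrity check")
    return data

payload = b"customer records " * 100
blob, digest = backup_block(payload)
assert restore_block(blob, digest) == payload
```

A restore that raises here is a bad backup caught at verification time rather than discovered, unrecoverably, at recovery time.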

Unauthorised listeners on the network, employees with malicious intent, or even ignorant personnel can cause a good backup to go bad. While most cloud vendors attempt to prevent the hijacking of data during transmission and make an all-out effort to safeguard customer data from malicious attackers, no system is absolutely hacker-proof. Wise users will insist on maintaining multiple copies of their backup as insurance against the possible corruption of any one copy.

Data replication and data mirroring are performed at the server end of the chain by cloud vendors to ensure high availability and security of customer data. Many cloud vendors encourage their customers to maintain a local copy of their data in addition to the offsite copies that the vendors create. Many vendors offer local backup devices as part of their package. The client-based software creates a local copy of the data on the onsite device even as a cloud-based copy is being created on the remote server.

We, at Backup Technology, understand the security needs of our customers. Our software logs every activity that is performed, and backup failures are instantly reported. The continuous backup option enables the backup to resume automatically after power is restored, while a failed schedule can be completed by manually invoking the backup process. Our encryption, decryption and compression algorithms are well tested and proven. We replicate, mirror and maintain customer information on multiple servers that are geographically dispersed to ensure high availability and disaster recovery.

Our Customers

  • ATOS
  • Age UK
  • Alliance Pharma
  • Liverpool Football Club
  • CSC
  • Centrica
  • Citizens Advice
  • City of London
  • Fujitsu
  • Government Offices
  • HCL
  • LK Bennett
  • Lambretta Clothing
  • Leicester City
  • Lloyds Register
  • Logica
  • Meadowvale
  • National Farmers Union
  • Network Rail
  • PKR
