All posts by Rob Mackle

Keeping Costs Under Control

The traditional ontology of costing, and the micro theories built on it, may need to be redefined in the virtual world, where value-adding activities are less visible except in areas such as strategic alliances, information sharing or online payment modes. These theories may have to be deployed generically to achieve cost management, irrespective of whether the enterprise belongs to the manufacturing or the service sector. The criteria for evaluation may include generality, efficiency, perspicuity, transformability, extensibility, granularity, scalability and competence.

So, how does one keep costs under control? The process must logically begin by recognising the characteristics of the cloud and identifying costing methods suited to the delivery and use of cloud services.

• The cloud consolidates the distributed enterprise and shifts all activity online to the remote data centre in the cloud. This largely precludes the possibility of pinpointing specific activities to specific cost centres.
• Capital expenditure (CAPEX) vanishes with the cloud and is replaced by operational expenditure (OPEX). Direct expenses suddenly take on the character of indirect expenses.
• Logistics costs and information management costs form a large chunk of the costs for the virtual enterprise.

Consequently, cloud vendors adopt use-based, granular costing methods to estimate the costs of maintaining a cloud and servicing the customer. There is a defined base cost and a top-up cost for the service. While base costs are fixed, top-up costs vary with the level of usage of the services. Cost control, in this context, comes down to monitoring the base-plus-top-up cost, or to evaluating the storage space or application use of the different participating departments and defining limits of usage for budgeting purposes.
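To make the arithmetic concrete, here is a minimal sketch of such a base-plus-top-up billing model in Python. The rates, limits and department names are entirely hypothetical and are not Backup Technology's actual tariffs; the point is simply that monitoring usage against agreed limits makes budgeting straightforward.

```python
# Minimal sketch of a base-plus-top-up cloud billing model.
# All rates, limits and department names are hypothetical illustrations.

BASE_COST = 100.00         # fixed monthly charge for the service (assumed)
STORAGE_RATE = 0.05        # charge per GB stored per month (assumed)
DEPARTMENT_LIMIT = 150.00  # usage limit agreed per department for budgeting

def monthly_bill(storage_gb_by_department):
    """Return the total bill and the per-department top-up charges."""
    top_ups = {dept: gb * STORAGE_RATE
               for dept, gb in storage_gb_by_department.items()}
    total = BASE_COST + sum(top_ups.values())
    return total, top_ups

usage = {"Finance": 1200, "HR": 300, "Engineering": 4500}  # GB stored this month
total, top_ups = monthly_bill(usage)
print(f"Total bill: {total:.2f}")
for dept, cost in top_ups.items():
    flag = "  <-- over agreed limit" if cost > DEPARTMENT_LIMIT else ""
    print(f"  {dept}: {cost:.2f}{flag}")
```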

However, the process is not as simple as it seems. The implication is that enterprises will have to get more sophisticated about shaping the use of resources within the enterprise. They will have to recognise utilisation risks, and management will have to play a significant role in ensuring that usage is both sufficient and financially viable. Cost-saving opportunities will involve using cloud-based applications only where the use makes business sense, and scaling up operations in the cloud only when there is a peak transaction load. Since pricing in the cloud is directly proportional to utilisation, IT finance staff must understand that IT operations have to become more business-like if costs are to be kept under control in the cloud. IT administrators must stay in lock step with business growth and success, and optimise the use of cloud-based IT resources to the advantage of the business.

We, at Backup Technology, are always ready to work with our customers in evaluating the economics of using the cloud. Our cloud services are designed to optimise resource use for our customers. Customers are kept informed of their usage data and have the option to scale usage up or down on demand. Alerts are generated when usage boundaries are reached. Compression, de-duplication and archiving technologies are used to help our customers achieve cost savings in the cloud.

Saving Space and Money with Data De-duplication

Like every disruptive technology, the cloud is hungrily absorbing and assimilating a number of minor innovations and utilities. Data de-duplication is one such innovation that has been successfully integrated with cloud technologies to deliver value.

Technology experts are quick to point out that data de-duplication is not really a technology; it is a methodology. It is a software-driven process that identifies and removes duplicates in a given data set. A single copy of the data is retained in the store, while all duplicates are removed and replaced with references to the retained copy. Every file that initially contained a copy of the data now contains a reference to the data item retained in the store. Whenever a file containing a de-duplicated data item is called for, an instance of the data is inserted in the right place and a fully functional file is generated for the user. This method of compressing data reduces the amount of disk space used for storage and so reduces storage costs.
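As a rough sketch of the methodology (not the actual implementation used by any particular vendor), the Python snippet below de-duplicates data blocks by content hash: one copy of each unique block is retained in the store, and every file is rewritten as a list of references that can later be expanded back into a fully functional file.

```python
import hashlib

def deduplicate(files):
    """De-duplicate files block by block.

    `files` maps a filename to its list of raw data blocks (bytes). One copy of
    each unique block is kept in `store`; each file becomes a list of references.
    """
    store = {}       # digest -> single retained copy of the block
    file_refs = {}   # filename -> ordered list of block references
    for filename, blocks in files.items():
        refs = []
        for block in blocks:
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)   # keep only the first copy seen
            refs.append(digest)
        file_refs[filename] = refs
    return store, file_refs

def restore(filename, store, file_refs):
    """Rebuild a fully functional file by re-inserting the referenced blocks."""
    return b"".join(store[ref] for ref in file_refs[filename])
```

Calling restore walks the reference list and re-inserts the retained blocks, which mirrors the way a de-duplicated file is reassembled when it is called for.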

The growing importance of de-duplication can be traced to the ever-increasing volumes of data being generated by businesses. As businesses continue to generate data, space becomes a major constraint and financial resources may have to be allocated to acquiring larger storage capacities. Consequently, any technology that allows them to “have their cake and eat it too” is welcome!

Data de-duplication can be “in-line” or “post-process”.

In-line de-duplication removes duplicates before the data is sent to the storage server. This saves bandwidth and time-to-backup, as the amount of data transmitted over the Internet is reduced and only the “clean” data reaches the storage server. However, de-duplication at the client end of the system is itself a time-consuming and resource-intensive process.

Post-process de-duplication removes duplicates from data that has already been uploaded to the storage server. There is no saving of time or bandwidth during transmission, but there is certainly a saving of processing time and client hardware resources at the point of transmission, since all de-duplication happens on the cloud vendor’s server. Modern backup companies use a combination of the two methods to get the advantages of both.
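The difference between the two approaches is mainly a question of where the hashing and comparison work happens. The sketch below illustrates the in-line variant under the assumption of a simple client-to-server interface; has_block and upload are illustrative names, not a real backup API.

```python
import hashlib

def inline_backup(blocks, server):
    """In-line de-duplication sketch: hash each block on the client and upload
    only blocks the server does not already hold.

    `server` is an assumed interface exposing has_block(digest) and
    upload(digest, block); it is not a real backup API.
    """
    manifest = []                                 # ordered references for the file
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if not server.has_block(digest):          # ask before sending
            server.upload(digest, block)          # only "new" data crosses the wire
        manifest.append(digest)
    return manifest                               # lets the server rebuild the file later
```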

Backup Technology has integrated data de-duplication with its cloud backup and recovery solutions. The all-in-one suites for cloud computing and online backup automatically provide de-duplication to subscribing clients. The software automatically detects and removes duplicate data and creates the appropriate references during the backup process, which saves money and results in faster backup and recovery. The extensive versioning used in tandem adds to the strength of the software, as older versions of any backed-up file can be recovered even if it has been deleted from the source computer. For these and other reasons, we invite you to try our award-winning cloud backup, disaster recovery and business continuity services, powered by Asigra. We are confident that you will be completely satisfied with what we have to offer!

Creating High Capacity Networks with Almost No Investment

It is an undeniable fact that growing organisations will experience degradation of network response times. More and more people logging in from more locations will jam the works and create intolerable wait times for everyone. The most urgent need will be to increase capacity and bandwidth, and to invest in capital goods that the organisation can ill afford. Management may baulk at the thought, but will have to give “capacity increase” serious consideration, as it has a direct or indirect effect on the bottom line; the existence of the business may depend on it. They may be forced to divert funds from other mission-critical activities to augment their network.

What if bandwidth and capacity could be increased with little or no investment? The question would be impossible to ignore. If you had been asked this question a decade or two ago, you would have deemed the questioner a lunatic or an ill-informed idiot; you would not have believed that it is possible to create high-capacity networks with little or no investment. Today, the statement provokes very little surprise. The cloud offers a pay-as-you-go model that allows you to increase or decrease the bandwidth and capacity you use in step with the peaks and troughs of your business. Cloud computing has made the “high-capacity network” a reality even for small and medium enterprises that are strapped for funds.

Cloud services offer a number of different options to end users. Infrastructure as a Service (IaaS) subscribers can requisition and obtain additional infrastructure from third parties who make infrastructure provisioning their business. Platform as a Service (PaaS) provides customers with the platforms and space they require for developing their custom applications. Software as a Service (SaaS) allows organisations to save on the costs of deploying and licensing the standard applications that their employees use regularly. Backup and recovery is made simple, with backup, recovery and disaster management offered as part of the subscription package.

The use of any or all of the above services is charged on a usage basis and can be treated as operational expense by the accountants. All that the organisation needs to invest in is a high-speed Internet connection with sufficient bandwidth to meet its current needs. High-capacity networks can slide into place with a mere subscription to a cloud service.

Rationalising VM Backup for Cloud Computing

Virtual machine sprawl is a more common problem than is generally believed. Backing up these VMs can become an expensive and time-consuming proposition. It is important to take a step back and view the entire process of VM creation and backup / recovery in a holistic manner. There is a need to rationalise VM backup for cloud computing.

Uncontrolled VM sprawl can result in wasteful use of backup resources requisitioned from cloud service providers, and it is a burden on the IT function. Decisions will therefore have to be taken at management level to institute an organisation-wide policy that centralises the creation and deployment of virtual machines. A quick survey of existing VMs, and the possibility of consolidating several of them, should be given serious consideration. Unnecessary and rogue VMs should be deleted before the cloud backup process is initiated. This will help the organisation avoid backing up stealth VMs and exhausting computing resources indiscriminately.
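A survey of this kind can be as simple as comparing the VMs discovered in the environment against a centrally approved register. The sketch below is a hypothetical illustration of that comparison; the VM names and the register are invented for the example.

```python
def survey_vms(discovered_vms, approved_register):
    """Split discovered VMs into approved machines and rogue candidates.

    `discovered_vms` is a list of VM names found in the environment;
    `approved_register` maps approved VM names to their owners.
    """
    approved, rogue = [], []
    for vm in discovered_vms:
        (approved if vm in approved_register else rogue).append(vm)
    return approved, rogue

# Hypothetical example data, not a real inventory.
register = {"erp-prod-01": "Finance", "web-prod-02": "E-commerce"}
found = ["erp-prod-01", "web-prod-02", "test-clone-17", "dev-sandbox-03"]

approved, rogue = survey_vms(found, register)
print("Back up:", approved)
print("Review or delete before backup:", rogue)
```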

Virtual machine rationalisation policies must include strict procedures for the creation and deployment of virtual machines by anyone in the organisation, with no exceptions. There should be strict procedures to prevent production systems from being implemented without appropriate approvals being sought and given. Backup needs should be clearly defined by the VM and data owners. Each defined need should be subjected to an impact analysis, and there should be clarity on the consequences of backing up, or not backing up, the VM.

Rationalisation of VMs for backup and recovery in the cloud will only be possible if IT administrators are equipped with the right tools to evaluate the virtual environment and arrive at the right solutions to the problem of VM sprawl. Monitoring tools should be made available to them to ensure that the environment remains clean and that rogue VMs are not introduced at any point without appropriate approvals. These efforts will help IT administrators keep tabs on raw and provisioned storage resources, and ensure that resources are not wasted by anyone in the organisation deploying unauthorised VMs. This will protect the organisation and help in the quick recovery of digital assets in the event of a disaster.

Getting Smarter with Cloud Computing

Complete automation is a myth. Absolute agility is a dream. But the cloud makes it possible to automate those routine processes and activities that would otherwise consume a considerable amount of time and deprive the organisation of precious time that could be spent innovating, communicating and building up the business.

The first step towards smarter computing is to spell out your rules and policies. These are the triggers and frames for intelligent process definitions. For instance, if you want only a certain section of your employees to have access to a specified set of data, it is important to have a user management policy. Each employee who is authorised for access must be given a user ID and password that allow access to the data set. The authentication server database must contain the information required to authenticate these employees and permit them to access the information. Anyone else attempting to access the information will then be automatically rejected and denied access to the data set. Once the policy is in place and the rules of access have been spelled out, the system will take care of the process intelligently.
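A minimal sketch of such a policy-driven access check is shown below. The user records, data set names and plain-text passwords are purely illustrative (a real authentication server would store hashed credentials); the point is that once the policy is expressed as data, the system can enforce it automatically.

```python
# Hypothetical user-management policy: which users may access which data sets.
ACCESS_POLICY = {
    "payroll_records": {"alice", "bob"},
    "sales_reports": {"alice", "carol", "dave"},
}

# Stand-in for the authentication server's database. Plain-text passwords are
# shown only for brevity; a real directory would hold hashed credentials.
DIRECTORY = {"alice": "s3cret", "bob": "hunter2", "carol": "letmein"}

def is_authenticated(user_id, password):
    """Check the supplied credentials against the authentication database."""
    return DIRECTORY.get(user_id) == password

def can_access(user_id, password, data_set):
    """Grant access only to authenticated users listed in the policy for the data set."""
    if not is_authenticated(user_id, password):
        return False                                   # unknown user or bad password
    return user_id in ACCESS_POLICY.get(data_set, set())

print(can_access("alice", "s3cret", "payroll_records"))    # True
print(can_access("carol", "letmein", "payroll_records"))   # False: not authorised
```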

The cloud allows heterogeneous systems to be enmeshed into a single system to increase enterprise reach and improve the agility of the business. This may involve the transfer of data and information between these systems, across time zones, over the Internet. Security during data transfer, and security at the point of data use, become major concerns. Cloud service providers use encryption and user management protocols in innovative ways to ensure the security of the information passing through the network. Data is encrypted at source and remains encrypted at rest. Only authorised users, authenticated by the authentication server, are given access to decrypted information. Anyone else attempting to listen in will be unable to access the decrypted information in any way, and such attempts generate alerts that can be tracked to their source.
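As a small illustration of the encrypt-at-source pattern, the sketch below uses the Fernet recipe from the widely available cryptography package. It shows the general idea of data remaining encrypted in transit and at rest; it is not the specific protocol used by any particular cloud provider.

```python
from cryptography.fernet import Fernet   # pip install cryptography

key = Fernet.generate_key()              # in practice held by the data owner / key service
cipher = Fernet(key)

record = b"Quarterly sales figures for the EMEA region"
encrypted = cipher.encrypt(record)       # encrypted at source, before leaving the enterprise
# The data travels and rests in the cloud in this encrypted form; anyone who
# "listens in" sees only ciphertext.

decrypted = cipher.decrypt(encrypted)    # only a holder of the key, i.e. an authorised,
                                         # authenticated user, recovers the plaintext
assert decrypted == record
```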

Organisations that have migrated to the cloud can let go of their tight hold on the amount of server and storage resources consumed by individual users. Users will consume only the resources they need for the present; the scalability of the cloud precludes the need to provision for and hoard resources against possible future needs. Moreover, users cannot indiscriminately consume space by storing duplicate pieces of information: the backup and recovery software automatically detects duplicates and eliminates them during the transfer to the storage repositories.

Interesting? It seems smart! Smart organisations get smarter with cloud computing!

Integration Myths—Are you the Victim?

Are you a victim of misinformation? Have you ignored integration because you believed any or all of the myths that surround the concept of cloud integration? Stop now and take another look at what you believe to be true. You may be wrong about the importance of integration.

Myth #1 – Integration is a quick-fix solution. Not true. It is quick, but not temporary: it delivers recurring value and continuously drives down costs. The reality is that integration puts in place a set of solutions that allows the organisation to take advantage of disruptive technologies. Efficient integration supported by best practices, such as the use of application-specific APIs or standards, data format management, transformation, logic management and monitoring, helps make integration the foundation upon which you can build a successful business. It makes information more accessible at granular levels and increases productivity. Costs are driven down as custom applications are replaced with standardised ones and the cost of supporting these processes falls. Using Platform as a Service (PaaS) or Infrastructure as a Service (IaaS) to support business applications and processes reduces costs further.

Myth #2 – Integration in the cloud is time consuming. Not true. Cloud computing relies more on standardisation and modelling than on-premise, custom-made applications do. The logic and configuration do not have to be developed anew; they are available and ready to deploy. This makes the cloud faster to adopt and easier to adapt to. The learning curve is shorter and less steep, development schedules can be refocused elsewhere, and the integration process itself can be timed and budgeted accurately.

Myth #3 – Integration requires expertise. Not true. On-premise hardware and software can be quickly and easily integrated with cloud services. The setup is uncomplicated and user friendly, and a number of setup wizards are available to guide the user through the process. The configurations are platform independent and do not demand special attention to the underlying hardware resources. The integration platform comes with fault tolerance and failover mechanisms. Additional resources or features can be added on the fly, and provisioning can be managed with ease by administrators or invoked instantly. There is also greater visibility into the health of the integration platform.

Myth #4 – Integration does not have a direct impact on the business. Not true. Integration is all about expanding the reach of the business efficiently and effectively. Customers, suppliers and mobile workforces can reach into your databases for the information they need and initiate faster communications. Customers can have more information on demand, and orders can be processed faster.

Our Customers

  • ATOS
  • Age UK
  • Alliance Pharma
  • Liverpool Football Club
  • CSC
  • Centrica
  • Citizens Advice
  • City of London
  • Fujitsu
  • Government Offices
  • HCL
  • LK Bennett
  • Lambretta Clothing
  • Leicester City
  • Lloyds Register
  • Logica
  • Meadowvale
  • National Farmers Union
  • Network Rail
  • PKR
