Tag Archives: Cloud

Efficient, Stable De-duplicating Processes

Storage needs are ballooning. If nothing is done, data volumes will soon overwhelm organisations like a tsunami that never recedes. There are very few choices. Organisations must:

1. Get more storage space;
2. Archive / take offline data that is no longer relevant; or
3. Compress the data stored.

While falling disk prices and innovative storage systems have made it possible to hire more disk space or archive data effectively, it is data compression that has received the most attention in recent years. Compression not only saves disk space, it also saves the bandwidth required for transmission of data over the network. Data de-duplication is a form of compression, and is relevant both for data storage and data archival.

Disk based “de-duplication systems” compress data by removing duplicates of data across the data storage system. Some implementations compress data at a ratio of 20:1 (total data size / physical space used) or even higher. This may be done by reducing the footprint of the versioned data during incremental or differential backup.

Vendors use a variety of algorithms to de-duplicate data. Chunking algorithms break the data into chunks for de-duplication purposes. Chunk boundaries may be defined by physical-layer constraints, by sliding blocks, or by single-instance storage algorithms. Client backup de-duplication systems use hash calculations to evaluate similarity between files, so that duplicates can be removed and replaced with references. Primary and secondary storage de-duplication designs also vary. While primary storage de-duplication is directed towards performance optimisation, secondary storage is more tolerant of performance degradation, so its de-duplication algorithms are constructed with more leeway.
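As a rough illustration of hash-based chunk de-duplication, the following Python sketch splits data into fixed-size chunks and stores each unique chunk only once; real systems typically use content-defined (sliding-window) chunk boundaries, and the function names here are illustrative:

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and store each unique chunk once.

    A minimal sketch of hash-based chunking; production systems usually
    use content-defined boundaries rather than fixed sizes.
    """
    store = {}    # chunk hash -> chunk bytes (the single stored instance)
    recipe = []   # ordered list of hashes needed to rebuild the data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # duplicates are stored only once
        recipe.append(digest)
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original data from the chunk store and the recipe."""
    return b"".join(store[d] for d in recipe)
```

Backup data with many repeated blocks yields a store far smaller than the original, while the recipe still allows exact reconstruction.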

Until recently, data de-duplication was only associated with secondary storage. The increasing importance of the cloud as a primary storage has created a lot of interest in de-duplication technologies for primary storage.

We, at Backup Technology, offer an agentless cloud backup service powered by Asigra. Our software de-duplicates data at source and transmits only the compressed, bandwidth-efficient incremental or differential blocks of the backup set over the network. Global-level de-duplication at source is also available for the hybrid versions of the cloud for local backups. The comprehensive solution is non-invasive and ideally suited for small and medium businesses, as well as enterprises. Why don’t you try out our software before you decide to commit yourself?

Real Time Transaction Monitoring in the Cloud

Critical ‘business-events’ are actions performed by the business. The transactions between two independent entities entering into an agreement to exchange items of value — such as purchase or sale of goods — are recorded as business transactions. Accuracy, integrity and accessibility of this information in real time are of immense importance to the business. How does the cloud make this possible?

A typical enterprise application provisions for the storage and access of transaction information in a database. However, in most enterprises, database integrity is compromised in several ways. Multiple application programs may connect to a single database to commit information. Database locks generated by one application may block updates initiated by another, resulting in data loss. Power shutdowns may result in a failure to commit information to the database. Consequently, the accuracy, currency and reliability of the information may be questionable.

Cloud-based database applications are ‘uniform’ constructs. The client application and the web browser based applications are alternate versions of the same software constructed with the same business logic. In most instances, all functions that are available in the client-based version of the software are available in the web version of the software. They do not perform any operations or actions that will compromise the integrity of the database at any point in time. The transaction database will remain consistent, accurate, reliable and current at all times.

Real time transaction processing in the cloud will emulate the business transaction. The application will register the start of the transaction, and will identify the appropriate tables that need to be updated. It will also perform the update, record the update in the log, and then commit the transaction to the database. If even one of these steps fails, an alert will be generated and the failure will be registered in the log. If a rollback settings option has been selected, the database will be rolled back to the pre-transaction stage and an alert will be sent to the user.
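The sequence described above can be sketched in Python with sqlite3, which commits the transaction only if every step succeeds and otherwise rolls back to the pre-transaction state; the table names and the alert are hypothetical, not part of any actual product:

```python
import sqlite3

def record_sale(conn, item: str, qty: int, price: float) -> bool:
    """Apply one business transaction atomically: update stock, write the
    audit log, and commit; on any failure, roll back to the
    pre-transaction state and raise an alert. A minimal sketch only."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute(
                "UPDATE stock SET qty = qty - ? WHERE item = ?", (qty, item))
            remaining = conn.execute(
                "SELECT qty FROM stock WHERE item = ?", (item,)).fetchone()[0]
            if remaining < 0:
                raise ValueError("insufficient stock")
            conn.execute(
                "INSERT INTO log(item, qty, amount) VALUES (?, ?, ?)",
                (item, qty, qty * price))
        return True
    except Exception as exc:
        # the failure is registered and the user alerted; the database is
        # already back in its pre-transaction state
        print(f"ALERT: transaction failed and was rolled back: {exc}")
        return False
```

The `with conn:` context manager gives the all-or-nothing behaviour: either both the stock update and the log entry are committed, or neither is.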

IT personnel need a coherent and reliable means of recording real time transactions in the cloud. Our software constructs for the client and the web make sure that your data remains consistent and integrated at all times. The software allows correct recovery from failures by rolling the database back to its original pre-transaction state. Users can choose to record all database activities in the log and to be alerted immediately if a transaction fails to complete.

Run your business without boundaries confident that Backup Technology, Powered by Asigra, can meet the ACID (Atomic, Consistent, Isolated and Durable) test of real time transaction processing head on!

Keeping Costs Under Control

The traditional ontology of costing and micro theories of costing may need to be redefined in the virtual world where value adding activities may be less visible except in areas of strategic alliance, information sharing or online payment mores. The theories may have to be generically deployed to achieve cost management irrespective of whether the enterprise belongs to the manufacturing or service sector. The criteria for evaluation may include generality, efficiency, perspicuity, transformability, extensibility, granularity, scalability and competence.

So, how does one keep costs under control? The process must logically begin by recognising the characteristics of the cloud and identifying the costing methods suitable for the delivery and use of services.

• The cloud consolidates the distributed enterprise and shifts all activity online, to the remote data centre in the cloud. This largely precludes the possibility of pinpointing specific activities to specific cost centres.
• CAPEX vanishes with the cloud and is replaced by OPEX. Direct expenses suddenly assume the aspect of indirect expenses.
• Logistics costs and information management costs form a large chunk of the costs for the virtual enterprise.

Consequently, cloud vendors adopt use-based, granular costing methods for estimating the costs of maintaining a cloud and servicing the customer. There is a defined base cost and a top-up cost for the service. While base costs are fixed, the top-up costs vary in accordance with the level of usage of the services. The secret of cost control, in this context, is simply to monitor base plus top-up costs, or to evaluate the storage space or application use of the different participating departments and define usage limits for budgeting purposes.
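A minimal sketch of this base-plus-top-up pricing model, with illustrative figures rather than any actual tariff:

```python
def monthly_cloud_cost(base_cost: float, usage_gb: float,
                       included_gb: float, rate_per_gb: float) -> float:
    """Base-plus-top-up pricing sketch: the fixed base fee covers an
    included allowance, and usage beyond it is billed per unit.
    All figures are illustrative."""
    top_up = max(0.0, usage_gb - included_gb) * rate_per_gb
    return base_cost + top_up
```

Monitoring each department's `usage_gb` against its `included_gb` allowance is exactly the budgeting exercise described above: usage under the allowance costs only the base fee, and every unit over it adds a predictable top-up.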

However, the process is not as simple as it seems. The implication is that enterprises will have to get more sophisticated about shaping the use of resources within the enterprise. They will have to recognise utilisation risks, and management will have to play a significant role in ensuring the sufficiency and financial viability of that use. Cost-saving opportunities will involve using cloud-based applications only where the use makes business sense, and scaling up operations in the cloud only when there is a peak transaction load. Since pricing in the cloud is directly proportional to utilisation, those managing IT finances must understand that IT operations have to become more business-like if costs are to be kept under control in the cloud. IT administrators must keep in lock-step with business growth and success, and optimise the use of cloud-based IT resources to the advantage of the business.

We, at Backup Technology, are always ready to work with our customers in evaluating the economics of using the cloud. Our cloud services are designed to optimise resource use for our customers. Our customers remain informed on usage data and have the option to scale up or scale down usage on-demand. Alerts are generated when usage boundaries are reached. Compression, de-duplication and archiving technologies are used to help our customers achieve cost savings in the cloud.

Conceptualising Cloud Based MIS

Conceptual design of an MIS is the output of an interactive, highly focused discussion between the business managers and IT professionals. It is a high level definition of the MIS objectives, guiding policies and constraints with reasoned consideration of viable inputs, storage, outputs, communication protocols and business processes for generation of alternate MIS designs and the selection of the best fit design for the organisation.

Input for MIS may be received from external or internal sources. For instance, a steel manufacturing company may receive inputs on the market price of iron ore, the cost of transportation and so on from external sources. It may have information about iron ore smelting, cleaning and processing times from internal sources. It may also hold some intellectual property, such as a formula for extracting iron from iron ore cost effectively. The business managers and the IT professionals will have to decide how they will integrate the information received from these different sources and how they will communicate the re-ordered information to employees at different levels of the organisation.

If the organisation has a number of branches scattered across geographical regions, the MIS design will have to give serious consideration to whether the data should be centralised or distributed. Both kinds of database have their advantages and disadvantages, and the choice will affect data retrieval. Access times, speed of access, latency issues and the like will determine how the organisation wants to make its data available to its employees. The volume of information available or generated by the system will impact capacity planning and influence how scalable a system the organisation wishes to deploy. The sequential or relational nature of the information will further determine how the information is organised and made available.

Organisations may process information in batches or record by record. Combination approaches are also not uncommon. The use of sophisticated modelling techniques in information processing may require complex applications such as CAD/CAM, and these applications may have to be re-configured or re-engineered for cloud deployment. Simpler applications, such as word processing, may be deployed with public licensing or shared licensing systems.
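The difference between record-by-record and batch processing can be sketched as follows; the 10% levy used as the business rule here is purely illustrative:

```python
def process_record(record):
    # illustrative business rule: net amount after a hypothetical 10% levy
    return round(record * 0.9, 2)

def process_stream(records):
    """Record-by-record: each item is handled as it arrives."""
    for r in records:
        yield process_record(r)

def process_batch(records, batch_size=3):
    """Batch: items are accumulated and handled in groups."""
    for i in range(0, len(records), batch_size):
        yield [process_record(r) for r in records[i:i + batch_size]]
```

The same rule is applied either way; what changes is when results become available, which is why latency-sensitive MIS output tends to favour record-by-record processing and periodic reporting tends to favour batches.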

Ultimately, the test of the system is in the output. The system design must ensure that the system is capable of delivering the right kind of output to the right level of employee, on time and at the right frequency. The output may be visual or verbal. It may be delivered directly or routed through the senior management.

It is important to get the conceptual level of MIS design for the cloud right. It is the basis on which the detailing for the cloud is built.

Understanding Cloud Risks and How to Mitigate Them

Doomsayers will say what they must, but, you need to prepare for the worst and plan to deliver the best of the cloud to your organisation. It is true that there are risks attached to cloud computing. But, these risks are controllable if you know what they are and how they need to be handled. Let us examine some of the risks and understand how they can be mitigated.

The use of the Internet as the network makes cloud computing vulnerable. Several issues are often highlighted in this context. First, the APIs used by cloud vendors are not checked for vulnerabilities by cloud users, and this creates security problems. Second, most cloud solutions are multi-tenant solutions, and vulnerabilities may be introduced at the software level, leaving user content open to attack. Third, cloud computing may create several compliance risks, whatever the vendor may claim. Finally, the proliferation of cloud subscriptions can complicate budgeting and IT spending and make it difficult to recognise patterns of spending, given that cloud billing is usage driven.

Take a close look at the statements. You will realise that each of these problems can be effectively countered with the right tools. It is obvious that there are a number of measures that can be taken to ensure that any transactions conducted over the Internet are inviolable and cloud subscriptions do not become uncontrollable. All it requires is a little bit of common sense, tons of enforceable discipline, and plenty of planning. This is the most boring part of cloud migration, but an effort that will yield huge dividends to any organisation in the long run.

Companies launching on their cloud venture must lay down clear ground rules. A cloud purchase policy will be a good starting point. The policy must stipulate that all cloud subscriptions must be authorised by a central authority and branches or mobile users cannot subscribe to cloud vendors of their choice. If one centralised authority does all the cloud buying, it will be possible to have a complete and holistic picture of cloud spending.

Further, the centralisation of purchases will facilitate any other risk mitigation activity that the organisation may wish to undertake. For instance, the IT Administrator can examine in some detail the software deployments offered by the cloud vendor. Any multi-tenancy environments can be vetted before they can be accepted. The compliance offerings can be tested before data is transferred to the cloud. The encryption protocols offered can be evaluated for impregnability of the information being transmitted over the Internet and stored in the cloud repository. It will be a good idea to check on the disaster recovery functionalities before the final commitment is made.

The bottom line is: you are responsible for the security of your cloud. Forget what the doomsayers have to say!

Intelligent Processes for Cloud Computing

If you are migrating to the cloud, make sure your processes have been tuned up and made intelligent. Intelligent processes can make your computing experience pleasant and meaningful. It can bring significant benefits to your business. It will be possible to measure your performance and even attribute numerical values to less tangible effects of cloud computing on your business.

The greatest contribution of the cloud is agility. The cloud automates processes with agile tools such as Document Management Systems (DMS) and Business Process Management Systems (BPMS). These tools allow organisations to input rules and procedures into the system, and to align the cloud with business policies on the fly. Process logs, which record data as processes execute, allow organisations to work with predictive and process-analytical tools to understand the glitches and problems that retard the smooth execution of their processes. The results can then be used to improve the processes and reduce the latencies that may occur due to process slowdowns or network problems. Analytical outcomes can also be used to predict future “what if” scenarios and assist in enterprise decision making.

The knowledge worker regards the cloud as a boon. The anywhere, anytime, any-device access made possible by the cloud allows specialists to apply their special skill and knowledge in decision making from wherever they are. The intelligent process will allow them to see all past and current information in a single window, and to make informed decisions about current or future business actions at once. Since all information input is policy driven, the knowledge worker receives information in context, and is empowered to make the best decision possible.

All this results in greater customer satisfaction. Greater customer satisfaction then translates into greater profitability for the business.

This brings us to the most important question: How can you make your processes more intelligent for cloud computing? Here are a few tips for starters:

1. Begin by modelling your individual processes;
2. Link all processes together into the value chain;
3. Add instrumentation to processes to collect business events information wherever possible;
4. Create analytical visualisations at runtime so that actual process activity can be observed and recorded accurately for future analysis;
5. Implement business rules based on business policies and observe how the rules impact the process flow. Do they retard the flow? Are process latencies increased by business rules?
6. Ensure process security and observe how security impacts process flow;
7. Improve process flows with process analytics and predictive analysis.
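Step 3 in the list above (instrumenting processes to collect business-event information) can be sketched as a minimal event recorder in Python; the process and step names are hypothetical:

```python
import time
from dataclasses import dataclass, field

@dataclass
class ProcessMonitor:
    """Minimal sketch of process instrumentation: business events are
    recorded with timestamps as the process executes, so that the log
    can later feed process analytics. Names are illustrative."""
    events: list = field(default_factory=list)

    def record(self, process: str, step: str, status: str):
        """Log one business event as the process executes."""
        self.events.append({"process": process, "step": step,
                            "status": status, "ts": time.time()})

    def failures(self):
        """A simple analytic over the log: which steps fail, and how often."""
        counts = {}
        for e in self.events:
            if e["status"] == "failed":
                key = (e["process"], e["step"])
                counts[key] = counts.get(key, 0) + 1
        return counts
```

A log built this way is the raw material for steps 4 and 7: the recorded events can be visualised at runtime and mined afterwards to find the steps that retard process flow.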

