“Data Search” and e-discovery are terms that crop up time and again in the context of compliance with legal mandates. How does cloud computing facilitate the process? Here are a few insights into what cloud service providers do to help customers keep their data discoverable for legal purposes.
Cloud computing challenges all assumptions about physical boundaries. This increases the need to protect data and to secure the privacy of individuals who entrust their personal information to business entities. It drives the need for new types of metadata that can ensure data integrity, security and individual privacy in an increasingly virtualised environment. The new metadata must create new types of relationships with content in an environment where sharing is integral to computing. New supports and safeguards must evolve as new dangers emerge if legal mandates are to be fulfilled.
Electronic data in the cloud may take many forms. These include evidentiary material in the form of telephone logs, transactional data, emails, instant messaging traffic, all manner of corporate communications, documents, records, video and sound recordings, and any other kind of relevant textual or non-textual information. All this material must be securely stored, easily accessible, formally indexed, uniquely identified and e-discoverable. There must be a common, flexible, extensible framework that facilitates deduplication, conversion, evaluation, summarisation and storage of this information in retrievable electronic formats.
Cloud software systems store all the above information in encrypted formats and generate metadata containing document properties for ease of discovery. Metadata is data about data, and it is a time-tested mechanism for data discovery in the world of computing. Complex metadata is becoming the foundation for targeted advertising as well as for mandated legal data discovery systems. Analytical support for this process may include limited entity extraction (e.g. from email headers) for pattern matching and for clustering documents according to identified topics. Review solutions may permit user-defined document sets with time-based selection controls. Text analytics may discern important entity relationships and extract these features for the classification and analysis of documents in storage.
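As an illustrative sketch of the ideas above (the field names and the in-memory index are hypothetical, not any vendor's actual schema), metadata generation and deduplication on upload might look like this:

```python
import hashlib
import time

def generate_metadata(name: str, content: bytes, owner: str) -> dict:
    """Build a metadata record for an uploaded document (illustrative fields)."""
    digest = hashlib.sha256(content).hexdigest()
    return {
        "document_id": digest,       # content hash doubles as a deduplication key
        "name": name,
        "owner": owner,
        "size_bytes": len(content),
        "uploaded_at": time.time(),  # discovery tools can filter on this later
    }

index: dict[str, dict] = {}  # stand-in for a secure metadata store

def store(name: str, content: bytes, owner: str) -> bool:
    """Index a document; return False if identical content is already stored."""
    meta = generate_metadata(name, content, owner)
    if meta["document_id"] in index:
        return False  # deduplicated: same bytes, no new record
    index[meta["document_id"]] = meta
    return True
```

Searching by owner, size or upload time then becomes a simple scan of the metadata store, without touching the encrypted content itself.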
We, at Backup Technology, believe that the legal discovery challenge lies not in data storage but in the generation of metadata. With millions of users, trillions of files and billions of file generations, even the smallest piece of data can be recovered from the cloud server if the metadata is properly generated and securely stored in the system. Backup Technology has proven the potential of its technology by ensuring that the right kind and quality of metadata is generated every time a piece of data is uploaded to its servers. Our team knows how this can be scaled up, reliably stored and quickly discovered.
Experts state glibly: “SaaS is ready for you, even if you are not yet ready for SaaS!” But what does “being ready for SaaS” imply? More importantly, what is SaaS? Let us answer the second question first.
SaaS is the abbreviation for “Software as a Service”. SaaS providers deploy industry-specific or generic web-browser-based applications on a subscription basis, over the Internet, to multiple enterprises or to employees within an enterprise, using shared public, private or hybrid cloud architectures.
SaaS readiness enforces due diligence for functional fit and data strategy. It highlights support requirements and draws attention to the economics of the cloud. It is evident that SaaS readiness has the potential to transform thinking on information technology and to create a service-centric approach to computing within the organisation.
Functional fit due diligence begins with an understanding that SaaS applications are built on generic business concepts: getting the business SaaS ready presupposes that business processes will be subsumed into generic, pre-defined processes, albeit with compromises. However, this standardisation can be cost-effective, and it may reshape the experience curve for the enterprise. The value of deploying enterprise expertise elsewhere may far outweigh the benefit of employing it to design differentiated, on-premise applications.
Due diligence for data fit ensures that the criticality of data conversions and system interfaces is not underestimated. Getting ready for SaaS may mean familiarising the enterprise with unfamiliar challenges. SaaS tools are typically wizards that guide the user through a task, and they are generic constructs. Extraction, transformation and load options may be limited, and overnight conversions may present difficulties associated with bandwidth availability. However, if the enterprise is looking for integrative processes, SaaS tools are well suited: they blend with diverse systems such as Oracle, SAP and other heavy-duty enterprise resource planning software.
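A minimal extract-transform-load sketch gives a feel for the conversion work involved; the legacy column names `CUST_NM` and `BAL` here are invented for illustration, and a plain list stands in for the SaaS load target:

```python
import csv
import io

def extract(csv_text: str) -> list[dict]:
    """Read rows exported from the legacy system."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows: list[dict]) -> list[dict]:
    """Map legacy fields onto the generic fields a SaaS schema might expect."""
    return [
        {"customer_name": r["CUST_NM"].strip().title(),
         "balance": round(float(r["BAL"]), 2)}
        for r in rows
    ]

def load(rows: list[dict], target: list) -> int:
    """Load into the target store; a real tool would call the SaaS API here."""
    target.extend(rows)
    return len(rows)

legacy_export = "CUST_NM,BAL\nada lovelace,100.504\n"
target: list = []
load(transform(extract(legacy_export)), target)
```

Even this toy pipeline shows why conversions should not be underestimated: every legacy field needs an explicit mapping, and malformed rows must be handled somewhere.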
SaaS readiness acknowledges that the benefits of SaaS are important for the organisation's growth strategy. IT infrastructure abstraction is not the only reason. The enterprise can shift the risk of software acquisition and convert IT from a reactive cost centre into a value-generating catalyst of growth. It can take advantage of the SaaS continuum in exchange for a small fee that can be budgeted as operating expense rather than as capital investment with long-run implications. It is a decision point where political, technological, financial and legal considerations come together for the betterment of the organisation.
Security threats are changing. They are becoming more persistent, virulent and debilitating. But strategies to control and counter these threats are also changing and evolving.
Two APTs that created ripples in recent years are the RSA SecurID hack and Operation Aurora. Unfortunately, both of these were state-sponsored threats and cannot be classed with the normal types of threats that organisations face in the course of computing over the Internet. The RSA SecurID hack was an APT carried out in 2011. The attack compromised systems that used RSA SecurID two-factor authentication tokens to generate one-time passwords.
Operation Aurora was an APT that stole sensitive intellectual property, including source code, from computing giants like Google and Adobe. The attack was sophisticated, coordinated and orchestrated. The attackers had immense technical skill and an ability to take advantage of weaknesses in the target organisations. Nor were the attacks short-term efforts aimed at capitalising on temporary windows of opportunity. They exploited vulnerabilities that the organisations themselves had not yet identified, and they were designed to unfold over a period of time (spanning years), using multiple vectors and combining a number of security breaches.
As a result, traditional methods of securing an organisation's data stores fail in the face of an APT. Alternative strategies will have to be discovered and implemented. The security strategy will have to be more proactive, capable of detecting and preventing an APT even as the perpetrators reconnoitre the organisation for weaknesses.
Organisations and cloud services may have to institute layered security. The layering will have to begin at the perimeter. Shared accounts will have to be managed effectively by encrypting and securing passwords, creating complex passwords that are difficult to break, restricting access to administrative accounts, and preventing password sharing through automatic login.
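Two of those controls, password complexity and salted password storage, can be sketched as follows; the policy thresholds and the PBKDF2 iteration count are illustrative choices, not any vendor's actual settings:

```python
import hashlib
import hmac
import os
import re

def is_complex(password: str) -> bool:
    """A simple complexity policy: minimum length plus mixed character classes."""
    return (len(password) >= 12
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"[a-z]", password) is not None
            and re.search(r"\d", password) is not None)

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store only a salted PBKDF2 digest, never the plain-text password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

Because only the salt and digest are stored, a breach of the account database does not directly expose the shared passwords themselves.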
The next security layer should include server hardening. Server hosts should be protected with firewalls and with definitions of high-risk applications to exclude. Sessions should be recorded and examined, and unusual activities should be instantly highlighted for deeper investigation. Analytical tools should be made available to evaluate these activities and to track the time, date, source IP and user ID of each login. Phishing protection, anti-virus installation and employee education should follow.
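The session-tracking part of that layer can be sketched as a simple audit log that records time, source IP and user ID per login, with a naive rule for highlighting unusual activity (the login-count threshold is an arbitrary illustration; real systems use far richer anomaly detection):

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LoginEvent:
    user_id: str
    source_ip: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class AuditLog:
    def __init__(self) -> None:
        self.events: list[LoginEvent] = []

    def record(self, user_id: str, source_ip: str) -> LoginEvent:
        """Append an event stamped with the current UTC time."""
        event = LoginEvent(user_id, source_ip)
        self.events.append(event)
        return event

    def flag_unusual(self, max_logins: int = 3) -> list[str]:
        """Highlight users whose login count exceeds a simple threshold."""
        counts = Counter(e.user_id for e in self.events)
        return [user for user, n in counts.items() if n > max_logins]
```

An analyst can then filter `events` by user, IP or time window when a flagged account needs deeper investigation.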
In short, “defense in depth” security concepts should be implemented.
For every business that has signed up for the cloud, there are hundreds that have not. Some are still sitting on the fence, trying to obtain assurances that the cloud is not a gateway to disaster. The cloud computing mishaps frequently reported in the news do not help. Most are unwilling to let go of the familiar in IT and step out into the unknown, whatever the advantages on offer.
Fence sitters must introspect. What holds them back from taking the plunge are concerns around reliability and security in the cloud.
Remember, security worries the cloud vendor as much as it worries the business person running the company. Cloud vendors know that clients will refuse to entrust their data to them if security is compromised in any way. Consequently, elaborate security systems are devised to give customers confidence in the service. Complex algorithms are created, tested and deployed to encrypt all information at source, so that data cannot be hijacked or compromised during transmission or in storage. User management systems are integrated into the agent software to empower administrators to configure users, user rights and permissions from a central, client-located console, in keeping with the policy dictates of the organisation.
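A centrally administered rights model of that kind can be sketched along these lines (the role names and actions below are invented for illustration and do not reflect any particular vendor's console):

```python
# Roles map to sets of permitted actions, so policy lives in one place.
ROLE_PERMISSIONS: dict[str, set[str]] = {
    "admin":    {"read", "write", "restore", "manage_users"},
    "operator": {"read", "write", "restore"},
    "viewer":   {"read"},
}

class UserDirectory:
    def __init__(self) -> None:
        self.roles: dict[str, str] = {}

    def assign(self, user: str, role: str) -> None:
        """Assign a role; reject anything outside the defined policy."""
        if role not in ROLE_PERMISSIONS:
            raise ValueError(f"unknown role: {role}")
        self.roles[user] = role

    def can(self, user: str, action: str) -> bool:
        """Unknown users get no permissions (deny by default)."""
        return action in ROLE_PERMISSIONS.get(self.roles.get(user, ""), set())
```

The deny-by-default check in `can` is the important design choice: a user the administrator has not explicitly configured can do nothing at all.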
Cloud vendors are extremely disaster conscious. They cannot afford to have their data centres destroyed by natural or man-made causes. They proactively create hot sites, disaster recovery sites and backup sites for customer information. They iteratively refine their disaster recovery plans, so that customers never feel the impact of downtime or loss of service. They physically and electronically secure access to their data centres with elaborate security systems, including biometric entry, surveillance cameras and access logs. Unauthorised intrusions at a physical or electronic level can be instantly detected and dealt with in well-constructed, secure data centres.
Well-established cloud service vendors are anxious to provide state-of-the-art support services to their customers. They know that this is one means of differentiating themselves from the competition. They hire the best available talent in the market to staff their support teams. They proactively report backup and recovery status and ensure that they are constantly available to the customer in a myriad of ways.
We, at Backup Technology, understand that trust is the foundation of a good business relationship. We work towards building trust and delivering what is promised. The fact that our customer base is growing geometrically is an indication of our market reputation and the reliability of our services. If you would like to know more about us, please visit our website. You can also download and try out the fully functional trial version of our software into the bargain!
The what, where, when and how of the cloud are persistent questions that must be answered correctly if a cloud deployment is to be successful. But misconceptions can be handicaps, and organisations often labour under a number of them. A little understanding of cloud architectures, management and chargeback can be useful in selecting the best-fit solution for their needs.
Managing and using different cloud architectures
The cloud has evolved from the convergence of a number of technologies and approaches to computing. The underlying architecture both resembles and differs from existing computing models, and this affects operational and technological approaches to network configuration and security practices. Like all computing systems operating over a network, the cloud consists of a back end [the remote server(s)] and a front end (the client computers). The connecting network is the Internet. The servers, applications and storage devices at the back end provide a cloud of services to customers. Cloud computing systems that cater to multiple clients are known as “public” clouds. When an entire cloud service system is dedicated to a single client, it is known as a “private” cloud. Hybrid clouds combine features of public and private clouds.
The client machines connect to the remote server(s) and the applications using software called an “agent”. The agent is a special kind of software, known as middleware. It enables IT administrators to monitor traffic, administer the system and set rules for access to and use of the information stores on the remote server.
“Utility computing” is the unique selling point (USP) of the cloud. Organisations signing up for cloud services agree that the cloud makes it easier for the organisation to track and measure IT expenses per business unit. Chargeback becomes simpler as it is metered like electricity on a “pay per use” basis.
Chargeback mechanisms in the cloud take into consideration two factors:
What are the resources and metrics for chargeback?
How to account for excess capacity that is supplied on the fly?
The chargeback system is built on the assumption that customers tend to use average rather than peak capacity, and hence offering scalable services does not automatically result in extensive usage of resources. Further, cloud vendors understand that successful chargeback systems separate infrastructure costs from service costs, and that shared infrastructure is a combination of fixed and variable costs in which the percentage of fixed costs decreases as the number of users increases. Pricing will consequently be unit-tiered, bundled, or pay-per-use.
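A unit-tiered price, for example, can be computed by filling successive tiers before an overflow rate applies. The tier sizes and rates below are made up purely for illustration:

```python
def tiered_charge(units: float,
                  tiers: list[tuple[float, float]],
                  overflow_rate: float) -> float:
    """Charge usage tier by tier; each tier is (units_in_tier, rate_per_unit)."""
    total = 0.0
    remaining = units
    for tier_units, rate in tiers:
        used = min(remaining, tier_units)
        total += used * rate
        remaining -= used
        if remaining <= 0:
            return total
    return total + remaining * overflow_rate  # usage beyond all defined tiers

# Example: first 100 GB at 0.10, next 400 GB at 0.08, overflow at 0.05
TIERS = [(100, 0.10), (400, 0.08)]
```

For 600 GB this yields 100 x 0.10 + 400 x 0.08 + 100 x 0.05 = 47.0. A pure pay-per-use scheme is just the degenerate case of no tiers and a single overflow rate, which is why the same metering machinery can support all three pricing models.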
Collaboration has been around for two decades now. Yet companies that have looked at the collaboration options available over the LAN exhibit unease with the concept and seem to prefer email for collaboration over dedicated collaboration tools. But email has its limitations as a collaboration tool. Users can collaborate with only one user at a time, and a number of versions of a document may be generated as multiple users work independently on the same document and return it to the storage server. The process of shuffling documents back and forth between users can also hamper productivity and may even prove restrictive, as it allows for only one working style.
The cloud has empowered collaboration by transforming Outlook and Word into collaboration tools. Barriers to entry, such as large monetary investments in collaboration tools, have been removed, and the pay-as-you-go model ensures that the tools are just a click away. SharePoint servers linked to the cloud become all the more powerful as partners and contractors integrate into the workflow, and remote branch offices no longer feel isolated from decision making, as they have access to the same tools.
Interestingly, collaboration is accommodating and absorbing a number of emerging devices and communication modes into workflows. The mobile worker with a smartphone, iPhone or laptop can quickly and easily link up to the network and collaborate with colleagues stationed across the globe. IM, Facebook, LinkedIn and other modes of communication add to the collaborative process, creating a dynamic, buzzing business environment.
This multiplicity of collaboration tools has naturally highlighted and intensified questions around security. But it must be remembered that the focus on security grows with mobility, and cloud vendors have made security enhancements to meet these new challenges. Moreover, documents passed around for collaboration are often hedged with access rights, roles and privileges, unlike email documents, which travel as mere attachments to the mail.
We, at Backup Technology, believe that collaboration will be the way of work in the future and have strengthened the collaboration tools available with our software. Collaborating with the right secure software is a low-risk, high-reward option. You are welcome to test drive our product. Sign up for a trial and see how our product helps your organisation capitalise on the new opportunities that open up with collaboration.