Data is becoming increasingly unstructured. With data produced on laptops, desktops, smartphones and other devices, consolidating and defining it has become challenging. Yet such data must be merged, defined and made available to the people who need it for decision making. So far, data consolidation has either never happened or happened very infrequently across verticals, forcing decision makers to guess at trends and reach conclusions from whatever data was at hand. Today, however, things are different, and data consolidation and definition can be handled properly.
The incorporation of cloud computing can change the face of computing across your company. You can centralise your data banks and link your mobile devices to them so that information can be downloaded or uploaded on the go. You can easily define enterprise policies for your mobile employees and automate data backup from a plethora of systems online. This makes it easy to collect, categorise, index and present data within a short period of its being generated.
Loss of a mobile device need not be disastrous. There are ways to prevent data breaches. For instance, you can run all of your mobile device data on a very “thin” client and store all of your information in an Internet-based repository. The loss of a mobile device then costs you only a piece of hardware, rather than hardware plus important information. Encrypting all of the data on the device gives you added security: crooks or unauthorised persons who get hold of your mobile device will not be able to access the data on the device, or in your Internet-based storage vaults, without the necessary authentication.
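The encryption point above hinges on key material that a thief cannot recover from the hardware alone. Here is a minimal Python sketch of the idea, using only the standard library: the key protecting the device's data is derived from a passphrase the user supplies, so possession of the device (which stores only the salt) yields nothing. The passphrase and iteration count are illustrative; the actual cipher step (e.g. AES) would come from a crypto library and is omitted here.

```python
import hashlib
import secrets

def derive_device_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches the user's passphrase into a strong
    # 32-byte key; without the passphrase, the key cannot be reconstructed
    # from anything stored on the device.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

salt = secrets.token_bytes(16)   # may be stored on the device in the clear
key = derive_device_key("correct horse battery staple", salt)
# `key` would now encrypt the device's data; a thief holding only the
# device and the salt cannot re-derive it.
```

The same derived key can unlock the Internet-based vault, so one secret the user memorises protects both copies of the data.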
Mobile access and computing have brought a paradigm shift in the way business is carried out. Even so, some people still question whether mobile computing is really “ready” to anchor mission-critical data. There is no doubt that plenty of uncharted and unexploited territory remains. Before taking the leap, you will need to evaluate the risks it could pose to your enterprise. Mobile computing architectures have yet to reach maturity, and you should scrutinise the applications before approving it as risk-free to extend your application platform and migrate to mobile computing. Application developers and system architects can face glitches, and demanding confidentiality expectations, when it comes to compliance with statutory privacy requirements.
Hopefully, the discussion above has made you aware of the facts when it comes to anchoring data generated on mobile devices.
Due to data breaches, protecting personal and health information has become a vexing issue. Numerous organisations, including those in the health care industry, have lost sensitive data, typically details of vendors, patients, staff, contractors and health ID numbers. When such a loss happens at a hospital, the hospital usually apologises for the inconvenience that staff and patients have faced. In some cases it tries to shift responsibility to another entity, claiming that the theft was the “result of negligence by an outside contractor” that had been hired as an “expert” in handling sensitive data.
But is shifting blame to a third party right? Third-party companies are selected precisely because they undertake to store and handle sensitive data properly. They make their living handling such data, and it is not in their interest to lose any of it.
To regain the trust of affected individuals, some vendors that have suffered a breach take responsibility by providing timely information and offering credit monitoring services for the affected accounts. Providing these services shows that the company has accepted its responsibility and acted on it to reassure individuals worried about their sensitive data.
Even when the vendor has met its obligation to notify affected account holders in accordance with legal mandates and federal regulations, the fact remains that sensitive data, including identities, has been stolen. The theft will affect the parties concerned for a long time, and they may well sue the organisation for negligence for millions of dollars. Such incidents raise questions about data security and precautions against data breaches:
• Is it good to share sensitive information with third parties for data storage?
• How do third parties give assurance to organizations that data will be protected and will never be accessed inappropriately or misused?
• What liability does a third party bear for data in its custody, and what penalties apply when information is misused?
Though the answers to these questions are not easy, the popularity of cloud storage services as third-party service providers has brought them to the forefront.
Enterprises entrusting their data to third parties must make the effort to ensure that the data is safe and secure, and should spend time and energy weighing up the reliability of the third party and its data protection claims. Here are some questions that can help in the search for a suitable third-party cloud storage service:
• How is data stored in the repository?
• Is the encryption methodology certified by a reliable authority?
• How do people access sensitive data and who has access to the data?
• What are the liabilities and rights of an organization in case of data breaches?
• Does the vendor share sensitive data with anyone? If so, with whom and why?
• Is the cryptographic mode of data security really impregnable?
• Can the vendor's assurances of data protection be verified independently?
• Does the vendor take responsibility for data protection and accept liability for breaches caused by its negligence?
Once your company has the answers to these questions, it becomes easier to evaluate a service provider and its security protocol, to understand the level of data security on offer, and to select the right service for protecting sensitive information.
Cloud computing is not free from risks. Smart organisations recognise the risks and deal with them. If your organisation is migrating to the cloud and is worried about security, that is a healthy sign: you are gearing up to identify possible risks and deal with them. We have listed a few commonly discussed risks below to start you off on your journey of discovery.
Risk #1 – Data security. This is a real concern. You are entrusting your data to a third-party cloud vendor. You may or may not know where your data will actually be stored, who will be given access to it, or how many times and onto which servers it will be replicated for redundancy and high availability. You do not have to take this risk blindfolded. You can get your vendor to spell out the details, evaluate the level of risk involved, and take steps to ensure that your data remains your intellectual property and will not be compromised in any manner by the activities of your cloud vendor. The good news is that most cloud service providers are ready to give you any information you want in this direction, and are even willing to commit to keeping your data secure from unauthorised entities. You can retain control over your data by employing “user defined” encryption keys to encrypt your data in their storage vaults.
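One concrete way to retain that control is to keep key material on your side and treat the vendor's store as untrusted. The sketch below, using only Python's standard library, attaches an HMAC tag to each blob before upload; on retrieval, the owner's key proves the data was not altered while in the vendor's custody. The key and data here are invented for illustration, and a real deployment would also encrypt the blob (e.g. with AES via a crypto library) before it leaves your premises.

```python
import hashlib
import hmac

def tag(data: bytes, key: bytes) -> bytes:
    # The owner keeps `key`; the vendor only ever sees data + tag.
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(data: bytes, key: bytes, expected: bytes) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(tag(data, key), expected)

owner_key = b"kept-on-premises-never-uploaded"
blob = b"quarterly-results.csv contents"
t = tag(blob, owner_key)   # the tag is uploaded alongside the blob
```

Any bit flipped in the stored blob makes `verify` fail, so tampering by the vendor or anyone sharing its infrastructure is detectable without trusting the vendor's own integrity reports.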
Risk #2 – Integration APIs require validation. True. Cloud services provide standard APIs. Customers using the cloud service must evaluate these APIs for any flaws and understand the extent of risk involved. Ask your cloud service provider all the right questions about the APIs and also about the underlying infrastructure sharing protocols, so that you are fully aware of how and where your data is stored and how accessible it is to others who are sharing that same infrastructure over the cloud. With the right tools and technology, you should be able to address the risks involved.
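Evaluating an API need not start with elaborate tooling; even checking that responses carry the fields and types your integration depends on will surface flaws early. The sketch below validates a hypothetical storage-API metadata response; the field names and the `validate` helper are assumptions for illustration, not any vendor's actual API.

```python
# Hypothetical response from a cloud storage API's object-metadata call.
response = {"name": "backup-2024.tar", "size": 1048576, "encrypted": True}

# Fields and types the integration relies on (invented for this sketch).
REQUIRED = {"name": str, "size": int, "encrypted": bool}

def validate(resp: dict) -> list[str]:
    """Return a list of problems; an empty list means the response passed."""
    problems = []
    for field, ftype in REQUIRED.items():
        if field not in resp:
            problems.append(f"missing field: {field}")
        elif not isinstance(resp[field], ftype):
            problems.append(f"bad type for {field}: {type(resp[field]).__name__}")
    return problems
```

Running such checks against every response, rather than trusting the API blindly, is one practical way to "evaluate these APIs for any flaws" as the risk above advises.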
Risk #3 – Cloud can complicate IT budgeting. True: if you have not centralised your cloud service purchases, your budget could get complicated. If multiple branches of your organisation purchase services from multiple cloud vendors, the risk of fragmentation is great, budgeting can get complicated, and confusion can prevail. Ensure that cloud service purchasing is centralised and all your branches ride the same cloud. Centralising your cloud services could even save you money, as the provider may bundle additional services into a package for a smaller fee. Centralised management is also convenient, since usage can be monitored and users tracked from a single-window interface.
If your data is burgeoning and your volumes are becoming unmanageable, it is time to get yourself some scalable storage. Virtualisation, no doubt, is a first step, but it is just that: the first step. You need to take the logical next step and move to the cloud.
The cloud is designed to handle unstable workflows. The peaks and troughs in your data flow need no longer bother you. You can scale up your storage when data volumes peak, and scale it down when your data volumes dip, and you need to pay only for what you use! If you pause and consider the implications of this, you will appreciate the flexibility you gain thereby. Your legacy systems never allowed you to enjoy this luxury. They were designed for unchanging workflows and you had to spend hours anticipating the peaks and troughs in workflows and provisioning for the same. You had to maintain redundant resources to ensure that you did not find yourself short when peaks were encountered.
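The pay-for-what-you-use point is easy to quantify. The toy Python comparison below contrasts legacy provisioning (sized for the annual peak, paid for every month) with usage-based billing; all figures, including the flat per-TB rate, are invented for illustration and bear no relation to any vendor's pricing.

```python
# Monthly storage demand in TB over a year (illustrative numbers).
demand_tb = [10, 12, 45, 11, 10, 50, 12, 11, 10, 48, 12, 10]

RATE_PER_TB = 20.0  # assumed flat price per TB per month

# Legacy: provision for the annual peak and pay for it every month,
# whether the capacity is used or not.
legacy_cost = max(demand_tb) * RATE_PER_TB * len(demand_tb)

# Cloud: pay only for what each month actually consumed.
cloud_cost = sum(demand_tb) * RATE_PER_TB
```

With these invented figures the peak-provisioned bill is roughly two and a half times the usage-based one, which is exactly the redundancy cost the paragraph above describes.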
Die-hard fans of legacy systems will be quick to point out that there must be some kind of trade-off involved. Perhaps scalability comes at the cost of performance? Well, no. It is the legacy systems that trade off performance. Legacy systems have monolithic controller architectures: when resources are shared, all the applications try to hog them, creating noise and performance dips. If resources are to be dedicated to performance-sensitive applications, a management nightmare is in the making, with resources hard-wired to specific applications while other applications are starved, and storage fragmentation can result. Cloud architectures, by contrast, are designed to dedicate and release resources on demand. This optimises utilisation and makes resources available to other applications as and when the demand arises, and storage fragmentation is never an issue because storage is never permanently dedicated to any one application.
Cloud resource scalability extends to security scalability. Data is always encrypted, isolated and kept in line with standard security requirements, and comprehensive security continues to be provided even when hardware resources are scaled up or down. This is in distinct contrast with legacy systems, where security cannot be scaled on demand and additional security can only be provided by adding physical resources to the data centre.
There was a felt need for standardised practices in cloud management as public, private and hybrid clouds gained popularity. Over the decades, a few standards have emerged and there are many more in the pipeline. Reputed cloud service developers and enablers like Asigra constantly update their software to include new tools that help IT managers and CIOs retain control over their data, maintain security of the systems, and manage their users efficiently. Here are a few cloud tools that help IT managers in their quest for data control.
The cloud democratises computing. While this is good for business, it creates a few headaches for management. For instance, mobile and remote users can upload or download data from wherever they are, with whatever device they have on hand, without obtaining permission from the IT Administrator. The security of the information, the type of applications in use on the connecting device, and so forth can create security problems that the Administrator must anticipate and provide for. It follows that there is an urgent need for policies and procedures that enforce access boundaries and user permissions, and for tools that enable the Administrator to implement them.
A number of cloud service providers use software agents with administrative dashboards that give the manager the tools needed to create and manage users, granting them company-defined rights and permissions to access the network and perform some or all operations on the data they access.
The cloud entrusts data to third-party servers, so IT managers worry about security. The cloud addresses a few of these concerns by provisioning layered data security. Most cloud service providers use third-party-certified (FIPS 140-2) cryptographic algorithms to encrypt data. These algorithms, often described as bank grade or military grade, generally use AES-128, AES-192, AES-256 or Blowfish, which have not been broken in practice to date. The symmetric keys used are often user-defined private keys that remain secure with the data owner.
Because of this encryption, the third-party service provider has no access to the content of the data store hosted on its cloud server. Security and availability of data are further strengthened by “as is” replication and disaster recovery systems, and by guarantees that the information will not be accessed by the service provider or its associates at any time. Managers can recover or purge the information held in the cloud stores at any time they wish to withdraw from the contract, using tools provided for the purpose.
Public clouds are suspect, whether or not the suspicion is justified, and so adoption of the public cloud has been slow. But change is becoming visible as more and more concerns about the public cloud are addressed, and it assumes its rightful place as a mode of computing that adds value to the business.
What is the value add that is to be obtained from public clouds? The value add from public clouds is in direct proportion to the commitment the organisation feels towards managing the cloud provider and employing the cloud solution responsibly and effectively. In other words, the responsibility for the success of the public cloud rests with the organisation and not with the cloud vendor.
If this seems counter-intuitive and contrary to all that you have heard about the cloud, it is nonetheless the truth. Public clouds do decrease costs and do deliver all kinds of benefits to the end user. But they bring with them a number of responsibilities:
- IT professionals within the organisation must stay engaged with the cloud and its implementation. They must make the effort to understand the terms and conditions of the contract and enforce any remedies built into it to ensure efficient performance of the contract by the cloud vendor. If public cloud performance is poor, the IT personnel within the organisation are to blame.
- The objective of the public cloud is not just backup and recovery; a whole gamut of activities happens in between. Establishing metrics and monitoring performance is a business imperative for IT managers. Unmonitored public clouds can cause untold difficulties for end users: latency, seek-time issues, or even backup and recovery failures may plague the organisation and make the whole cloud experience unpleasant.
- Availability and security are promises of the cloud vendor. But untested security can be dangerous. IT managers will have to test the security systems repeatedly and run disaster recovery exercises to ensure that everything promised can be delivered, at the appropriate time and at the pace required.
- Nothing can be managed without appropriate tools. IT managers need to ensure that the cloud service provides the managers with the right tools for the right tasks. There should be tools for scheduling backups and recoveries. There should be tools for managing users, stores or archives. There should be tools for generating and analysing reports on user activity or system activity. Finally, there should be tools for verifying service level agreements (SLAs) and implementations.
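The metrics and SLA-verification points above need not involve anything elaborate to start with. The Python sketch below tracks backup-window durations against an assumed contractual ceiling and flags breaches; all the figures and the 60-minute limit are invented for illustration, not taken from any real SLA.

```python
# Backup job durations in minutes over a week (invented figures).
backup_minutes = [42, 38, 55, 40, 95, 41, 39]

SLA_LIMIT = 60  # assumed contractual ceiling per backup window

# Days on which the backup window overran the SLA.
breaches = [(day, mins)
            for day, mins in enumerate(backup_minutes, start=1)
            if mins > SLA_LIMIT]

# Simple metrics an IT manager might report against the contract.
average = sum(backup_minutes) / len(backup_minutes)
worst = max(backup_minutes)
```

Even a crude report like this, run daily, turns "the vendor promised availability" into evidence you can hold the vendor to, which is precisely the monitoring responsibility the list above assigns to IT managers.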
It should be remembered that cloud service providers do not understand your business; they only understand their own. It is up to you to make sure that their tools are used to your benefit.