CIOs can’t stay away from the cloud anymore. The cloud enables businesses to create new business models and accelerate innovation. It also represents a new type of data management, as more and more companies leverage the cloud to implement new IT services, increase the availability of their applications and make backup more efficient.
But European businesses that want to use a public cloud run by a large US vendor today, or any cloud solution that involves transferring data to the US, will quickly run up against European data protection legislation as well as country-specific regulations.
In its judgment of 6 October 2015, the European Court of Justice declared the Safe Harbor Agreement between the United States and Europe invalid. Europe in this context refers to the 28 member states of the European Union (EU) plus Iceland, Liechtenstein and Norway, as well as Switzerland. The ruling means that companies that previously transferred data from these countries to the US must now find new ways to ensure that their data is processed in accordance with EU data protection obligations.
The judgment sets no specific transition periods. It also means that the national data privacy authorities must define their action plans in light of the ruling, which may be covered and unified under the European data protection reform. IT managers should therefore act now and look for ways to set up their IT infrastructure more flexibly, so that they can execute distinct privacy compliance strategies and meet future legislative changes.
Some 4,700 US-based multinational companies, including some of the largest cloud providers and storage companies, relied solely on Safe Harbor to document their compliance with data protection rules. While these companies are now scrambling to comply with local laws, a few other US-based companies have taken a different approach to compliance. NetApp went the extra mile: the storage company legally collects, processes and transfers data from the EU to the US according to country-specific regulations rather than Safe Harbor. NetApp is authorized to transfer personal data under Binding Corporate Rules (BCRs), the strictest level of compliance within the EU, which fewer than 30 US companies have achieved. It also uses model contractual clauses, data privacy agreements, consents and other lawful mechanisms for collecting, processing and transferring data.
Data control is the key to success
Companies must ensure that they retain full control over their data despite cloud usage. Only then is it possible to show – in a hybrid cloud scenario – whether data is being processed outside the country’s borders. At the same time, the CIO needs to know at all times how and where the cloud provider and any sub-service providers are running their cloud data centers. They must also clearly understand which jurisdictions the data flows through before reaching the data center, and where the data is backed up. Data also often exists in several copies: user data, primary data or application-generated data may be archived, backed up and replicated for disaster recovery. The location of the data center alone does not address the complexity of data privacy compliance.
Moreover, the CIO must ensure that, having adopted cloud services, the company continues to have complete control over its data and processes. This applies to personal data as well as other business-critical information such as intellectual property, research and customer data. In addition, some cloud providers use third parties to manage their data centers, which can – if this is not transparent to the end user – increase the risk that control of the data slips further away from the company. In this sense, NetApp not only serves as a model for executing a compliant data privacy program; the data management specialist also designed the concept of a Data Fabric, including a hybrid multi-cloud data management solution that offers such control as well as data protection.
The concept of Data Fabric
Data Fabric is NetApp’s vision for the future of data management. It is designed to enable customers to respond and innovate faster because data is free to flow where it is needed most. To fulfill this vision, Data Fabric also serves as a technology architecture for the hybrid cloud, integrating any combination of resources that are on premises, near the cloud and in the cloud.
With “NetApp Private Storage for Cloud” (NPS), the company offers a solution that delivers on the Data Fabric vision. It combines local, on-premises IT resources with managed cloud services and allows the IT infrastructure to use data storage and data processing (compute) separately. This makes it possible to extend certain IT services, which still run under the customer’s full control and in its own data center, into the cloud, where the data can move freely. For this purpose the customer operates a private storage system in a colocation data center located in close geographical proximity to the large public cloud providers, exploiting the very low latency of data transmission over short distances to achieve the highest transmission rates at the network level. This colocation infrastructure makes it possible to separate storage and compute, which is a key factor in complying with local data protection rules.
From a data protection point of view, such a hybrid cloud solution allows data to be processed out of the cloud instead of in the cloud. Data remains in a storage system under the company’s full control. The public cloud services used – for example, computing, applications and other “as-a-service” offerings – engage at certain times and use this data. Applications started in the cloud can change data or generate new results based on the data stored in the colocation data center. However, the company uses only the computing power offered by the cloud provider; no data is permanently transmitted to or stored in the cloud. In the legal sense, therefore, there is no “transmission of data” into the cloud: the data always remains within the customer’s IT infrastructure.
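This separation of storage and compute can be illustrated with a minimal sketch. The code below is not NetApp’s implementation; it simply assumes a cloud compute instance that has the colocation storage system mounted at a hypothetical path (for example over NFS), reads its input from that mount, transforms it entirely in memory, and writes the results back to the same mount – so nothing is ever persisted on the cloud provider’s own disks.

```python
import json
from pathlib import Path

# Hypothetical mount point: the private storage system in the colocation
# data center, exported (e.g. over NFS) and mounted on the cloud compute
# instance. The path is purely illustrative.
NPS_MOUNT = Path("/mnt/nps")

def process_in_cloud(mount: Path) -> None:
    """Read input from private storage, transform it entirely in memory,
    and write the results back to the same private storage.

    No intermediate files are created on the compute instance's local
    (cloud) disk; all transient state lives in RAM only.
    """
    records = json.loads((mount / "input.json").read_text())
    # Volatile, in-memory transformation of the data.
    results = [{"id": r["id"], "total": sum(r["values"])} for r in records]
    # Results land on the company's own storage system, not in the cloud.
    (mount / "results.json").write_text(json.dumps(results))
```

In this pattern the cloud instance is disposable: once the job finishes and the instance is terminated, no copy of the data remains on the provider’s systems, which is the property the legal argument above relies on.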
Privacy advocates could now argue that although no data has been transferred into the cloud, contracted data processing still takes place. For conventional use of cloud resources this is true, since such services are organized much like an outsourcing model. The approach with an intermediate colocation provider, however, uses the cloud as an extension of the company’s own IT resources, without an explicit transfer of the data to a third party. This is akin to blind data processing, in which the IT processes are started and consistently controlled by the company. The data processing is done in volatile processes, predominantly in memory. One can therefore argue that the company brings the computing power into its own area of control rather than using the cloud as an outsourcing service.
The decisive criterion here is control of the data processing. The cloud data center’s role is limited to carrying out the operational data processing and keeping records of the period of use. Responsibility for data protection remains with the company.
The type of infrastructure usage described here is therefore not considered data transfer in the conventional sense. Nor is it contracted data processing, as the customer is merely using a technical extension of its IT resources under the company’s sole control. Since the data is stored on the company’s own storage system, complete control is guaranteed. This arrangement is therefore not subject to the strict requirements for permissible data transmission to third countries, and it does not entail contracted data processing that would have to be comprehensively secured both contractually and technically.
NPS is one of the solutions that bring the concept of Data Fabric to life. Other components include AltaVault, for enterprise-class cloud-integrated backup and archiving, and Cloud ONTAP, the cloud-based version of the ONTAP storage operating system. The Data Fabric creates a hybrid multi-cloud solution that utilizes cloud resources while meeting requirements for mobility, availability, security, confidentiality and continuous control of application-relevant data. The mobility of the data, and in particular the ability to restore it, is guaranteed throughout the entire duration of the processing, from “cradle to grave”: from the first computing job out of the cloud to the return transmission and “residue-free” deletion from the cloud provider’s systems. This combines the power of the cloud with the requirement of uncompromised data control, offering a new, secure, flexible and compliant home for data.
By Dr. Dierk Schindler and Sheila FitzPatrick