The following licence is applicable to this document:
Creative Commons Attribution 4.0 International Public License
This report presents the results of a survey conducted by E-Space into the standards and standardisation activities for cloud services. The survey was commissioned by the Standardisation Forum and conducted between September and December 2023. The report provides an overview of European, international and national standards, and the standardisation activities that are relevant to cloud services, systems and platforms. The research also identified ‘existing gaps’, i.e. areas where open standards for cloud services are needed but do not yet exist or need more robust policies. The report is structured as a pyramid: it starts with the conclusions and recommendations. These aspects are then explained in greater detail and substantiated in the rest of the document.
This is the final draft for discussion by the Standardisation Forum.
The Standardisation Forum advises on the obligatory or voluntary adoption of open ICT standards in the public sector. To be able to advise the government on standardisation in relation to the cloud, the Standardisation Forum’s aim is to get an idea of the standards that are important for government policy on the use of cloud services. To this end, the Standardisation Forum Office commissioned the independent consultancy firm, E-Space, to conduct research.
An important starting point for the research was the letter from State Secretary van Huffelen dated 29 August 2022, in which she announced a change in the central government’s policy on the use of cloud services. The new policy sets frameworks for the central government’s use of commercial public cloud services and updates the former policy dating back to 2011, which focused mainly on keeping cloud services under the government’s own management.
Government policy on cloud services raises questions about digital sovereignty, vendor independence and information security. Open standards for data portability, cloud interoperability and information security can help to address these concerns. The main objective of the research was therefore to provide an overview of European, international and national standards, and the standardisation activities that are relevant to cloud platforms, systems and services. Trends and risks were also identified.
The Standardisation Forum formulated the following questions for this research:
What European, international and national standards are there in relation to the cloud, in particular in terms of cloud interoperability, data portability, information security and orchestration?
Which European, international and national cloud standards are still under development or planned?
Are there any existing gaps? In other words, are there cloud technologies or applications that require open standards which do not yet exist and are not yet under development?
Which cloud standardisation activities, including in Europe and internationally, should the government or the private sector in the Netherlands be able to influence? For instance, because they are moving in a direction that is not in line with government values, including openness, inclusion, information security, privacy, digital sovereignty and a balanced market. And to what extent is it possible to influence these standardisation activities?
The research was conducted between September and December 2023. Sources were analysed for the research (see Appendix 9) and interviews were held with 27 experts, including from the Netherlands Standardisation Institute, the Netherlands Organisation for Applied Scientific Research, the Association of Netherlands Municipalities, the ICTU, the Netherlands Authority for Consumers & Markets, the Ministry of the Interior and Kingdom Relations, the Ministry of Justice and Security, the National Cyber Security Centre, the Cloud Security Alliance, Microsoft and IBM. (See Appendix 10 for the full list of experts interviewed.)
Additionally, meetings on the subject of cloud and standards were attended. These meetings were the iBestuur conference held on 13 September 2023, the Haven community day held on 31 October 2023 and the ECP Annual Festival held on 16 November 2023.
This report is structured like a pyramid as much as possible, starting with the conclusions (Section 2) and recommendations to the government (Section 3), followed by the substantiation (Section 4).
The methodology, scope and basic assumptions and the results of the desk research are included in the appendices. Appendix 5 gives an explanation of the cloud and cloud services, types of deployment of cloud services and a list of major cloud service providers worldwide, in Europe and in the Netherlands. Appendix 6 sets out the scope and basic assumptions of the research, including a summary of the renewed Government Cloud Policy of August 2022. Appendix 7 comprises an overview of global trends, European developments and initiatives from the Dutch government regarding cloud policy.
The researchers arrived at the following conclusions based on the research:
There are not yet enough open data portability and interoperability standards that are generally accepted and widely supported. Partly because of this, the government is becoming more and more dependent on cloud service providers.
There are still no European standards, particularly for the portability of complex data, such as databases. Currently, providers are developing their own data transfer technology. For portability of less complex data at the level of files, however, tried-and-tested open standards, such as IMAP, WebDAV, JSON, XML and CSV, can be used. The government should make support for these standards mandatory for sharing files between cloud services.
Many standards for cloud security overlap, especially where management, processing and certification are concerned. Here the government should choose a clear line and, in doing so, align itself with the current state of affairs in Europe and internationally.
The interweaving of commercial cloud services with proprietary and other types of AI is making it more difficult to achieve data portability and interoperability based on open standards. Providers are leveraging AI to increase customer dependency on their services.
Cloud service providers are prepared to some degree to move with the times if the government takes the lead and provides clarity on the open standards that should be supported. One example of this is the Haven standard, which was developed by the Association of Netherlands Municipalities. By aligning policies at a European level, the major hyperscalers will be more willing to follow suit.
New norms and standards are emerging in the wake of the debate on digital sovereignty and imminent legislation, like the Data Act. That is why it is important to continue to actively monitor standards for cloud services.
These conclusions are substantiated in the detailing of the service models in Appendix 5.
“I see no gaps in standards for cloud security. There are in fact too many overlapping standards in this domain”.
Linda Strick, director of the Cloud Security Alliance
The survey produced a clear idea of the challenges related to the increasing use of cloud services and facilities. Several recommendations for the government emerged from the conclusions of the survey; these recommendations are presented here.
Major cloud service providers are prepared to some extent to collaborate on standards that facilitate interoperability and data portability. At the same time, dependence on these providers is growing. This tendency is clearly evident in the market study carried out in 2022 by the Netherlands Authority for Consumers & Markets (only available in Dutch).
A 'winner takes all' market dynamic is emerging for cloud services, which means that the market share of the largest hyperscalers keeps on growing at the expense of smaller market players. Interweaving cloud services with proprietary AI is exacerbating this development. Smaller market players lack the knowledge and computing power required for large-scale AI, which in turn strengthens the position of the major hyperscalers. The Data Act is a step in the right direction and will bring more equilibrium to the market, but it will have to be implemented, inter alia by developing open standards and making them obligatory. Initiative and control are required for this.
Advice to the government: exercise greater control over the cloud services market, both at a national and European level. Cloud service providers are prepared to go along with this, but are asking for clarity about the standards to be implemented. Open standards at a European level are needed to reduce dependence on hyperscalers and other providers. There are none as yet; the government must work towards this in a European context. In addition, the government and the European Commission are in a position to put pressure on cloud providers to make their proprietary standards public so that third parties can use them.
“The cloud is a network market that does not observe the rules of traditional markets. Open standards are necessary to keep this market healthy”
Rudi Bekkers, Professor of Standardisation and Intellectual Property at the Eindhoven University of Technology.
Knowledge and expertise of public cloud services is scarce and scattered across the government. This is in contrast to on-premise cloud environments, where much more knowledge and experience is available. Several respondents indicated that there is little coordination in the government for all the measures and actions that need to be taken to keep up with and manage the rapid development in cloud services. Without its own aggregated expertise, the government is also growing more dependent on market players in terms of knowledge, and in particular the major cloud service providers based abroad.
Advice to the government: focus on training, knowledge building and knowledge retention, and on collaboration between government organisations. In this respect, align with national and international organisations, such as the Netherlands Standardisation Institute and the Cloud Security Alliance. Consider making a public-sector organisation responsible for this, so that coordination is organised centrally. This could be the National Academy for Digitalisation and Computerisation of the Government (RADIO). And attract more experts to work in the civil service.
A great deal is being done regarding standards and norms frameworks for information security and privacy. Some developments overlap, and this can result in ambiguity. It came to light during the research that government organisations working on security controls are not always aligned with international standards or organisations. This is despite the fact that international collaboration can strengthen the government’s position. The Ministry of the Interior and Kingdom Relations, for instance, is working on a control baseline based on Secure Cloud Business Applications (SCuBA).
Advice to the government: conduct in-depth research into the cohesion and overlap between national, European and international standards frameworks. After that, choose a clear line, make decisions and align with European and international organisations and developments. Make sure that they are transposed into standards so that organisations are obliged to observe them.
As it stands now, the Standardisation Forum is focusing solely on standards for data exchange between organisations. This makes it difficult to mandate standards such as IMAP, which are crucial for data portability and vendor independence but are generally not used for data exchange between organisations.
Advice to the Standardisation Forum: investigate whether the scope of the ‘comply or explain’ list can be broadened as part of their existing mandate to include standards that promote vendor independence and digital sovereignty, even if these standards are not predominantly applied to data exchange between organisations.
There are still few open standards that support data portability. ISO 8000 provides guidelines for effective and efficient data exchange between different systems, organisations and technologies. This involves the standardisation of formats and terminology to avoid misunderstandings and errors. That said, there is no set standard for exchanging data included in databases.
Advice to the government: set up a working group to identify open standards for data portability and submit these standards for inclusion on the Standardisation Forum’s ‘comply or explain’ list to make them mandatory. This could involve standards that are still under development or are yet to be developed. In the latter case, the working group could indicate who should develop these standards, preferably in a European context. Make practical decisions for the period during which these standards are not yet available. Opt, for instance, for the Amazon Simple Storage Service (S3) API specification for file sharing until an open standard is developed for this. Draw up a list of open-source database formats that cloud providers must support as a minimum requirement, including formats used by government agencies, such as Postgres, MySQL and Mongo.
For the purposes of promoting interoperability between cloud services from different providers, this study identified several standards that are not yet on the Standardisation Forum’s list of open standards, but may have to be made mandatory.
Advice to the government: carry out further research to determine which of these standards are eligible for inclusion on the Standardisation Forum’s ‘comply or explain’ list. In the process, align with the standards mandated pursuant to the Data Act, and put pressure on providers to implement them. This research can be conducted by the same working group mentioned in the advice given in Section 3.6.
As far as standards for system and application portability in the cloud are concerned, Haven (an initiative of the Association of Netherlands Municipalities) has laid a good foundation for uniform use of Kubernetes. Commercial hyperscalers endorse and support Haven and there are already various kinds of cloud deployments at municipalities. This development contributes significantly to the improvement of interoperability.
Advice to the government: submit the Haven standard for inclusion on the Standardisation Forum’s ‘comply or explain’ list. Help to fortify Haven’s development by ensuring wider application and sound funding. Strengthen Haven by supporting a wider range of features, so that the standard can continue to develop into a replacement for proprietary services.
The list of the Standardisation Forum’s recommended standards includes standards that support data portability at the level of files and that should be made obligatory for cloud providers. This concerns, for instance, open standards such as IMAP, which can be used to transfer email messages from one cloud provider to another. WebDAV is another example; it can be used to manage files through a standard interface. By putting these standards on the ‘comply or explain’ list, governments will have to insist on them in their tenders for cloud services, and suppliers will therefore have to offer them.
Advice to the government: submit the recommended standards that are relevant for data portability for inclusion on the Standardisation Forum’s ‘comply or explain’ list. Put pressure on cloud suppliers to implement open standards based on the government’s strategic vendor management software (VMS). Even better would be to put pressure in alliance with like-minded Member States at European level.
Standards and frameworks for standards are crucial, but sharing knowledge and experience on their application is also important.
Advice to the government: also establish best practices for deploying cloud services using frameworks for norms and standards, just like ISO 27002 is a kind of best practices elaboration of ISO 27001. This provides more practical points of reference to government authorities that have to request the standards and apply frameworks for standards.
The research used the three service models for cloud services, namely Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS), as defined by NIST SP 800-145. Appendix 5 explains these three service models in greater detail. Government organisations procure all these service models from suppliers.
The three service models for cloud computing require a range of standards to promote interoperability, information security, privacy and portability:
Security and privacy standards: Implementing security standards contributes to mitigating the risk of unauthorised access to information that could affect the information security of the owners and users of that information. The standards cover aspects such as data encryption, authentication, authorisation and audit logging.
Privacy standards focus on reliably handling, and maintaining the confidentiality of, personal data stored and/or processed in the cloud. Examples of this include the standards for data masking, anonymisation and pseudonymisation.
Portability standards: These standards make it easier to move applications and data from one cloud environment to another. Here we have in mind the standards for containerization and orchestration. They can help to avoid vendor lock-ins and support multi-cloud strategies.
Interoperability standards: These standards ensure that the various cloud services and components can communicate and exchange data with each other in a standardised way. They can also help to avoid vendor lock-ins and support multi-cloud strategies.
Other standards: These are standards that cannot be classified as above but are relevant to cloud and cloud services and should therefore be noted.
The underlying report covers the following aspects for each type of standard:
Norms.
Existing standards included on the Standardisation Forum’s list of open standards.
Existing standards and standard technology not yet included on the Standardisation Forum’s list of open standards.
Standards that are under development.
Missing standards, ‘existing gaps’.
When differentiating between standards and norms, the research used the following definitions: a standard is a generally accepted specification or best practice guideline, whereas a norm is an officially established requirement that products, services or processes must meet.
This document uses the word ‘standard’ in the broad sense. This includes open standards and vendor-dependent (‘proprietary’) standards, as well as technology that can be construed as de facto standards.
Privacy and security are key focus areas for users of cloud services. Cloud facilities have to meet all privacy and security requirements, just as they currently apply to on-premise facilities.
Security and privacy standards can be viewed as belonging to several levels, namely strategic, tactical and operational. The interviews with experts revealed that there are various and partly overlapping certification schemes at the tactical level, partly overlapping auditing frameworks at the operational level, as well as several cloud security frameworks and guidelines.
In the context of the cloud, it is essential that personal data is stored and processed securely. Within the framework of European legislation, there are three permissible types. Personal data is:
stored and processed in the EU;
processed and stored in countries subject to an adequacy decision; or
processed and stored in a country where there are safeguards in place for those processing operations that demonstrate that personal data has the same protection as if the processing were performed in the EU.
Currently, it is not always clear to users where the cloud facility is physically located and who has access to it. Also, thus far managing cloud facilities in countries that do not have the same rules regarding the protection of personal data is quite common. Cloud providers are working hard to create cloud facilities that comply with one of the three aforementioned types.
Security and privacy standards largely build on the security and privacy standards already included on the Standardisation Forum’s ‘comply or explain’ list. This concerns:
TLS: TLS is designed to protect internet connections, with the aim of securely exchanging data between internet systems (such as websites or mail servers).
DNSSEC: Recipients can use domain name system security extensions to check the authenticity of domain name information, including IP addresses. This prevents, for instance, attackers from manipulating IP addresses without being detected (DNS spoofing) and in the process redirecting sent emails to one’s own mail server or misdirecting users to a fraudulent website.
STARTTLS and DANE: Email traffic between mail servers goes via SMTP. In addition to SMTP, STARTTLS together with DANE prevents eavesdropping on or manipulation of email traffic by cybercriminals.
HTTPS and HSTS: HTTPS and HSTS together make sure that internet connections with websites are secure, with the aim of securely exchanging data between internet systems (often a web browser). This makes it more difficult for cybercriminals to redirect traffic to fake websites and intercept the content of web traffic.
NEN-ISO/IEC 27001: The ISO 27001 standard sets out the requirements that an information security management system (ISMS) must comply with.
SAML: Security assertion markup language (SAML) is a standard for securely exchanging user authentication and authorisation data between different organisations. SAML makes it possible to securely access services from different organisations over the internet, without needing your own login details for each service, or having to log in to each service separately. DigiD and eHerkenning, for instance, use SAML.
The security and privacy standards largely build on the security and privacy standards already included on the list of recommended standards. This concerns:
OAuth 2.0: Users or organisations can use OAuth 2.0 to give a programme or website access to specific private and other data stored on another system, without disclosing their username and password. A minimal sketch of this flow is given after this list.
It was noted during the research that the attributes exchanged under OAuth 2.0 need more detailing when used in the Ministry of Justice and Security’s Trusted Cloud.
IPsec: This standard makes it possible to encrypt IP connections. This secures the network and facilitates the exchange of sensitive information. It is particularly relevant for VPNs. In other cases, protection at the transport level is more appropriate.
OIDC: OpenID Connect (OIDC) is an open and distributed way to reuse one authentication service of choice across several government and semi-public sector service providers, for instance, when it is used in web applications and mobile apps. The main reasons for using OIDC are the active developments and the mobile-first strategy for supporting digital public services.
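As an illustration of the OAuth 2.0 flow mentioned above, the sketch below shows a client obtaining an access token and then calling a protected API without ever sending a username or password. The token endpoint, scope, credentials and resource URL are hypothetical placeholders, not part of any existing government service.

```python
# Minimal sketch of an OAuth 2.0 client credentials flow; all endpoints and
# credentials below are hypothetical placeholders.
import requests

TOKEN_URL = "https://idp.example.nl/oauth2/token"     # hypothetical authorisation server
RESOURCE_URL = "https://api.example.nl/v1/dossiers"   # hypothetical protected API

# 1. Exchange client credentials for a short-lived access token.
token_response = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "dossiers.read"},
    auth=("my-client-id", "my-client-secret"),         # placeholder credentials
    timeout=10,
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# 2. Call the protected resource with the bearer token only.
api_response = requests.get(
    RESOURCE_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
print(api_response.status_code)
```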
Certification Scheme for Cloud Services (EUCS): ENISA is developing the Certification Scheme for Cloud Services (EUCS) on behalf of the European Commission, and CEN/CENELEC is converting it into two European standards. The scheme is expected to be available in 2024. It sets standards for the security of data stored and processed in the cloud. The aim is to bolster trust in cloud service providers while at the same time ensuring compliance with EU regulations such as the General Data Protection Regulation (GDPR). Cloud service providers can be certified at three different assurance levels (basic, substantial and high). The certificates are recognised at a European level and also play a role in European regulations, under which a cloud service provider certified according to EUCS is ‘expected to be compliant’. Based on EUCS, cloud service providers can demonstrate their compliance with high security standards, which is essential for businesses and organisations that want to store and process sensitive data in the cloud. A noteworthy fact is that there is still no link between the GDPR and this certification.
CCM-framework: The Cloud Controls Matrix (CCM) is a cybersecurity control framework designed for cloud computing environments. It provides a detailed structure of security policies, procedures and technical measures that can be applied to various cloud service models, including infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). In addition to its key control objectives, CCM includes:
implementation guidelines;
a model for the shared responsibility for security;
audit guidelines;
identifying and analysing other relevant security standards and frameworks, and legal and regulatory requirements;
continuous security statistics;
assessment questionnaire (consensus assessments initiative questionnaire – CAIQ).
The CCM is also an open standard that is available free of charge. The CCM is the backbone of the Security, Trust, Assurance and Risk (STAR) programme of the Cloud Security Alliance (CSA), which is a broadly applied cloud assurance programme that comprises an ecosystem of best practices, standards, technology and audit partners. STAR supports organisations when it comes to effectively and efficiently addressing the defining of trust in the cloud, promoting accountability, assessing risks, measuring assurance and simplifying compliance and procurement.
As part of the STAR programme, organisations can demonstrate compliance with CCM requirements using a range of assessment mechanisms, such as:
STAR self-assessment: a self-assessment based on a standardised questionnaire (CAIQ).
STAR certification: an independent third-party certification process based on ISO 27001 requirements, supplemented by CCM controls and additional transparency requirements.
STAR attestation: an independent third-party attestation process based on SOC 2 requirements, supplemented by CCM controls and additional transparency requirements.
The STAR programme requires organisations to publish details of their security and compliance position, including compliance with regulations, standards and frameworks, in a publicly available register called the STAR Registry. This information is valuable for existing and potential customers seeking assurance on the security practices of cloud service providers. In summary, the STAR programme and CCM provide a structured approach for organisations, cloud service providers as well as users, to improve and raise awareness of their cloud security practices, making it easier to handle risk management, compliance with regulations and transparency in the cloud computing space.
QERMS (Qualified Registered Electronic Mail Service): The Qualified Registered Electronic Mail Service is an advanced kind of electronic communication intended to replace traditional registered post. It is a legally recognised way to send and receive electronic messages with a high level of security, verifying the identity of the sender and receiver, and safeguarding the integrity and non-repudiation of the content sent. The sending as well as receiving times of messages are accurately recorded, making QERMS ideal for legal and official correspondence for which proofs of delivery and receipt are essential. QERMS is often used in business and government environments for reliably exchanging sensitive or legally binding documents. QERMS has been set up in accordance with EU Regulation eIDAS (EU) No. 910/2014 and is based on ETSI 319 401, ETSI EN 319 521 and ETSI EN 319 531.
NTA7516: NTA 7516 is a Netherlands technical agreement (NTA) that provides guidelines for the secure exchange of medical information via email. This standard was specifically developed to safeguard the privacy and security of patient information when medical records are sent between healthcare providers and patients or between various healthcare facilities. NTA 7516 sets requirements for aspects such as the identification and authentication of the sender and receiver, the encryption of the data, and the integrity and confidentiality of the information sent. The purpose of this standard is to ensure that electronic communications in the healthcare sector comply with the strict privacy requirements as laid down in the General Data Protection Regulation (GDPR), and to facilitate secure and reliable exchange of medical information.
MTA-STS (Mail Transfer Agent Strict Transport Security) is a security standard that enhances the security of email transfers between servers by enforcing TLS (transport layer security) encryption and specifying the required TLS policy levels. This standard is designed to address common security problems, such as man-in-the-middle attacks, which involves intercepting emails during transport. By publishing an MTA-STS policy on their domain, domain owners can indicate that their servers support TLS and define which version of TLS to use, which ensures that emails are exchanged more securely. This helps to safeguard the confidentiality and integrity of emails during transfer and is an important step towards having a more secure email infrastructure. In this way, MTA-STS improves the security of email communications by ensuring that connections are encrypted and by reducing the chances of interception or tapping.
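To illustrate the mechanism, the sketch below shows how a sending party could retrieve and parse a domain’s MTA-STS policy, which is published as a plain text file on the domain’s mta-sts subdomain. The domain name is a placeholder; this is an illustration of the protocol, not a production implementation.

```python
# Minimal sketch of fetching and parsing an MTA-STS policy; "example.nl" is a
# placeholder domain.
import requests

domain = "example.nl"
policy_url = f"https://mta-sts.{domain}/.well-known/mta-sts.txt"

response = requests.get(policy_url, timeout=10)
response.raise_for_status()

# The policy is a simple key/value text file, for example:
#   version: STSv1
#   mode: enforce
#   mx: mail.example.nl
#   max_age: 86400
policy, mx_hosts = {}, []
for line in response.text.splitlines():
    if not line.strip():
        continue
    key, _, value = line.partition(":")
    if key.strip() == "mx":
        mx_hosts.append(value.strip())   # multiple mx lines are allowed
    else:
        policy[key.strip()] = value.strip()

print(policy.get("mode"), mx_hosts)
```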
Comment: The expert opinion on STARTTLS and DANE of August 2018 calls on the Standardisation Forum to evaluate the state of the MTA-STS alternative technology in a year’s time, after the STARTTLS and DANE expert meeting.
mTLS: Mutual TLS (mTLS) is a security protocol whereby both the client and server authenticate each other based on transport security layer certificates, a process that provides an additional layer of security on top of the standard TLS/SSL handshake. As opposed to standard TLS, for which only the server gives its identity to the client, mTLS requires both parties to prove their identity based on digital certificates. This reinforces security by giving both parties assurance about the identity of the other, which in turn reduces the risk of data interception or forgery. mTLS is often used in environments with stringent security requirements, such as in financial services, healthcare and internal network communications in the business sector. It provides more reliable and more secure communications because unauthorised access to networks and data is prevented.
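The sketch below illustrates the client side of an mTLS connection: the client presents its own certificate in addition to verifying the server’s. The endpoint and certificate paths are hypothetical placeholders, and the server must be configured to request and verify client certificates.

```python
# Minimal sketch of a client-side mutual TLS (mTLS) request; the endpoint and
# certificate files are placeholders.
import requests

response = requests.get(
    "https://internal-api.example.nl/health",   # hypothetical mTLS-protected endpoint
    cert=("client.crt", "client.key"),          # client certificate and private key
    verify="internal-ca.pem",                   # CA bundle used to verify the server
    timeout=10,
)
print(response.status_code)
```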
There are many ISO standards in the field of information security, cybersecurity and privacy protection, including the ISO 27100 series. These are generic standards. The following ISO standards are intended specifically for the cloud:
ISO/IEC 27017: ISO/IEC 27017 is an international standard that provides guidelines and best practices for information security in cloud environments. This standard is an elaboration of ISO/IEC 27002 that focuses specifically on cloud security and provides additional security checks and implementation guidance for cloud service providers as well as cloud users. ISO/IEC 27017 deals with aspects such as cloud infrastructure security, virtual machine management, data encryption, and operational security procedures in the cloud. It helps organisations identify and manage the security risks associated with using cloud services, and it supports organisations when it comes to complying with regulations and industry standards. By observing this standard, organisations are in a better position to ensure the integrity, confidentiality and availability of their data in the cloud, which is crucial in today’s digital age.
ISO/IEC 27018: ISO/IEC 27018 is an international standard that provides guidelines for protecting personal data in the cloud. This standard is a code of practice for cloud service providers that process personal data, and complements existing ISO/IEC 27001 and 27002 standards for information security management. ISO/IEC 27018 focuses specifically on privacy issues, including the management of personally identifiable information (PII), transparency for the use of data, and stringent security measures to protect the privacy of users. This standard is particularly important for organisations offering or using cloud-based services. It helps them to meet privacy requirements laid down by law and to maintain customer and stakeholder trust by demonstrating that they are taking serious measures to protect personal data.
No existing gaps have been defined for privacy and security standards. The perception is that there is no shortage of norms, standards and frameworks, of which some even overlap. When it comes to privacy and security, efforts should in fact be made to reduce this overlapping.
Portability standards are guidelines and specifications designed to facilitate the portability of data, software and systems between various platforms, systems and devices. In the context of cloud standards, portability standards are essential for transferring applications, systems and data from one cloud environment to another.
We differentiate between two types of portability standards:
Standards for system and application portability: This concerns the ability of software to function on different hardware or operating systems without significant changes. These standards help to reduce the dependencies of specific platforms.
Standards for data portability: This refers to the ability to move data easily from one system or platform to another. Data portability is essential in the digital age, in which information often has to be transferred between various applications, databases or storage systems. Data portability standards ensure that the transfer of data goes smoothly, while at the same time maintaining the integrity and the usability of the data.
System and application portability are founded on containerization and virtualisation. Containerization is a technology that packages applications and their dependencies in containers, creating a consistent, isolated and lightweight environment for applications. The result is containers that are easy to move between the different cloud providers. This approach promotes portability, scalability and efficiency, and is crucial for modern development methods.
In addition to containerization, virtualisation also has an important role to play. Virtualisation is a more substantial kind of abstraction of the physical machine, because virtual machines have a full OS.
In practice, these technologies are often used to complement each other. Many companies use virtual machines to create robust, isolated environments for their infrastructure, while using containers in those virtual machines to manage their applications efficiently and consistently.
The following system and application portability standards were mentioned during the research. They are not on the Standardisation Forum’s list of open standards:
Kubernetes: Kubernetes, often abbreviated as K8s, is a powerful open source system for managing container applications in a cluster. Originally developed by Google, it is based on their internal system, which is called Borg. Kubernetes was released in 2014 as open source software. Kubernetes is currently the de facto global standard for container orchestration. Kubernetes is an open source platform designed to automate the deployment, scaling and management of containerised applications, making it easier to deploy and manage complex applications reliably and at a large scale. Kubernetes is widely supported by virtually all cloud providers. The three hyperscalers offer everyone a standard setup based on Kubernetes. Strictly speaking, Kubernetes is not a standard; instead it is an open source technology and, with that, a standard technology to promote portability.
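The sketch below illustrates why Kubernetes supports portability: the same API call, here made with the official Python client, behaves identically on any conformant cluster, whether it runs at a hyperscaler, in a Haven environment or on-premise. It assumes a valid kubeconfig for the target cluster and is an illustration only, not part of any standard.

```python
# Minimal sketch of a provider-independent Kubernetes API call using the
# official "kubernetes" Python client; it assumes a kubeconfig is present.
from kubernetes import client, config

config.load_kube_config()        # reads ~/.kube/config for the chosen cluster
apps = client.AppsV1Api()

# List deployments in all namespaces; the call is identical on every
# conformant cluster, regardless of the underlying provider.
for deployment in apps.list_deployment_for_all_namespaces().items:
    print(deployment.metadata.namespace,
          deployment.metadata.name,
          deployment.spec.replicas)
```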
OCI (Docker): The Open Container Initiative (OCI) is a Linux Foundation project that was established in 2015 by Docker and other container industry leaders. The purpose of OCI is to create open industry standards for container formats and runtimes. A key part of the OCI is the specification of container runtimes and image formats. Docker, as a leading platform in containerization, plays a crucial role in these standardisation efforts. Docker containers are based on OCI specifications, which means they are compatible with other OCI-compliant tools and systems. This ensures consistency in the way containers are built, shared and implemented, regardless of the underlying environment. Docker has also contributed to the development of key standards and tools in the OCI ecosystem, and has promoted the general adoption and success of containerization in the software industry. OCI specifications are available on GitHub.
Haven: Haven is the implementation profile for Kubernetes in the Netherlands. It prescribes a specific configuration for Kubernetes that should be implemented on existing technical infrastructure, for instance, on a cloud or on-premise platform. In the process, it provides a standard configuration intended for the government in the Netherlands. The prescribed configuration ensures that every Haven environment is functionally equal, irrespective of the underlying technical infrastructure. Think of it as a layer of abstraction that provides a common starting point. It offers several advantages: uniformity in the technical infrastructure, interchangeability of applications, vendor independence, platform independence and cost reduction.
Terraform: Terraform, developed by HashiCorp, is an influential open-source infrastructure as code (IaC) tool that makes it possible to define and manage infrastructure using a high-level configuration language. It allows users to deploy and manage cloud as well as on-premise resources in a consistent and predictable way. Terraform uses declarative configuration files that specify the required state of the infrastructure, which ranges from physical devices, such as servers and network equipment, to high-level components, such as DNS entries, SaaS features and more. This allows developers and operators to roll out and manage infrastructure in an efficient, repeatable way. One of Terraform’s key features is its support for a wide range of infrastructure providers, such as AWS, Microsoft Azure, Google Cloud, VMWare, OpenStack and many more. The broad scope of its compatibility enables users to implement and manage multi-cloud strategies without having to learn the deployment details of each provider. Even though it is not an implementation of a formal external standard, Terraform was created based on several core principles and designs, and it is a standard in the world of infrastructure as code (IaC). The most important standard that Terraform introduced and followed is its own configuration language, which is called HashiCorp configuration language (HCL). Terraform has been included in this report because it is considered to be the de facto global standard. Some time ago, it used to be a fully open source product, but that is not the case now. Even though it has been applied on a large scale, there are also open source forks, of which OpenTofu is the best known.
Open Virtualization Format (OVF): Open Virtualization Format (OVF) is an open standard for packaging and distributing software solutions for virtual machines. The Distributed Management Task Force (DMTF) developed it. OVF is designed to facilitate portability and the easy installation of virtual applications across different virtualisation platforms. This format describes a virtual machine, including the structure of the VM, the necessary hardware resources and the required software images. It also contains metadata, such as product information, licences and configuration options. OVF offers a standard way to package virtual machines and the associated configurations into a single distribution unit. This approach simplifies the management of multi-tier applications, reduces the complexity of deploying and moving VMs between the various environments and provides greater interoperability between different virtualisation platforms. OVF enables organisations and individual users to distribute and manage complex multi-platform and multi-VM workloads simply and easily. OVF is supported by all major virtualisation providers, including Virtual Box, Red Hat, VMWare, Microsoft, IBM, Google and AWS.
ISO/IEC 19941:2017: ISO/IEC 19941:2017 is an international standard that provides guidelines and best practices for cloud computing interoperability and portability. This standard is designed to facilitate the exchange and use of data and applications across various cloud services and platforms. It defines terms and concepts related to interoperability, i.e. the ability of different systems to work together effectively; and portability, i.e. the ability to move applications and data easily between different cloud environments. ISO/IEC 19941:2017 addresses essential topics, such as cloud system design, data format and exchange, and the interaction between the various cloud service models. Its purpose is to support organisations in efforts to reduce vendor lock-in risks, and to improve flexibility and freedom of choice as far as cloud computing solutions are concerned. At the time of writing, ISO was developing a new version of this standard.
In addition to the above, several standards that may become relevant in the future are still under development. Liqo.io is one of them. Liqo.io is an open source project that enables dynamic and seamless Kubernetes federated cluster topologies. It supports heterogeneous infrastructures, including on-premise, cloud and edge infrastructures. The University of Turin in Italy developed it.
The standards listed above are not yet on the Standardisation Forum’s list of open standards. Together they address most of the requirements for system and application portability for cloud services. It therefore makes sense to investigate whether these standards can be included on the list of open standards.
Data portability concerns the ability to easily transfer data from one system to another. ISO 17788 defines data portability as follows:
“The ability to easily transfer data from one system to another, without being required to re-enter data. Note: it is the ease of moving the data that is the essence here. This can be achieved by the source system supplying the data in exactly the format that is accepted by the target system. But even if the formats do not match, the transformation between them can be simple and straightforward to achieve with commonly available tools. On the other hand, a process of printing out the data and rekeying it for the target system cannot be described as ‘easy’.”
When creating standards for data portability, a distinction must be made between data in the form of files, such as photos, videos or electronic office documents, and complex data, such as databases.
A limited number of standards for data portability at the level of files have been included on the Standardisation Forum’s list of open standards, namely:
CalDAV: CalDAV is an internet standard that is used for the synchronisation and sharing of calendar information on servers. It is an extension of WebDAV (web-based distributed authoring and versioning), a protocol based on HTTP, and it is designed to give users access to scheduling information on a server. CalDAV makes it possible for users to create, modify and delete appointments and scheduled events on a shared server. Changes are automatically updated and synchronised across all user devices.
WebDAV: WebDAV (web-based distributed authoring and versioning) is an extension of the HTTP protocol that enables users to create, edit and manage files on web servers in a collaborative way. This technology makes it possible for several users to collaborate on documents and files as if they were on a local network drive, using functionalities such as uploading and downloading files, creating folders, copying and moving files, and keeping track of versions. Commonly used in a range of applications such as content management systems, online collaboration tools and cloud storage services, WebDAV offers a standardised way for users to access and work with remote files directly through their web browser or specific client software.
Both of these standards have the status of ‘recommended’ on the Standardisation Forum’s list. Several experts advocate that CalDAV and WebDAV should be given ‘comply or explain’ status, which would make them obligatory when purchasing cloud services. Apart from the aforementioned standards, there are also commonly used tools such as rsync and dd for exchanging files between systems.
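As an illustration of file-level data portability over WebDAV, the sketch below uploads a file and lists a folder using plain HTTP methods. The server URL and credentials are hypothetical placeholders.

```python
# Minimal sketch of file transfer over WebDAV; the endpoint and credentials
# are hypothetical placeholders.
import requests

BASE_URL = "https://files.example.nl/webdav"     # hypothetical WebDAV endpoint
AUTH = ("demo-user", "demo-password")            # placeholder credentials

# Upload a file with an HTTP PUT request.
with open("report.pdf", "rb") as f:
    upload = requests.put(f"{BASE_URL}/report.pdf", data=f, auth=AUTH, timeout=30)
upload.raise_for_status()

# List the contents of the root folder with a PROPFIND request (Depth: 1).
listing = requests.request(
    "PROPFIND", BASE_URL, headers={"Depth": "1"}, auth=AUTH, timeout=30
)
print(listing.status_code)   # 207 Multi-Status carries an XML listing
```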
There are no broadly accepted open standards for cloud storage. Amazon’s Simple Storage Service (S3) can be seen as a de facto standard for cloud storage, because it is widely accepted and used in the industry. Many other cloud storage services and tools are compatible with the S3 API (application programming interface). Besides Amazon, several other vendors offer object storage products and services according to the S3 standard. MinIO is an example. Microsoft Azure Blob Storage also offers similar functionality.
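The de facto status of the S3 API is illustrated by the sketch below: the same client code can target AWS, MinIO or another S3-compatible object store simply by changing the endpoint URL. The endpoint, bucket name and credentials are hypothetical placeholders.

```python
# Minimal sketch of using the S3 API against an S3-compatible object store;
# endpoint, bucket and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.nl",   # could also be AWS, MinIO, etc.
    aws_access_key_id="DEMO-ACCESS-KEY",
    aws_secret_access_key="DEMO-SECRET-KEY",
)

# Upload a local file and list the bucket contents via the standard S3 calls.
s3.upload_file("report.pdf", "my-bucket", "report.pdf")
for obj in s3.list_objects_v2(Bucket="my-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])
```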
Apart from the standards given above, it is worth mentioning DTP. The Data Transfer Project (DTP) is an open source initiative that focuses on facilitating data portability between several internet platforms. Google launched the project on 20 July 2018 and it has partnerships with major technology companies such as Facebook, Microsoft, Twitter and Apple. DTP facilitates customer-controlled bulk data transfers between two online services, which makes it easier for users to move their data between the various platforms.
Through collaboration between the various technology giants, the project aims to create a seamless and efficient experience for users who want to transfer their data, for example when switching email services, social media platforms or data storage services. DTP is in the first stage of development.
There is no established standard that enables the standardised exchange of data stored in databases. The following database-independent file formats are mentioned. They are also included in the Standardisation Forum’s list of recommended standards:
CSV (Comma-Separated Values)
JSON (JavaScript Object Notation)
XML (eXtensible Markup Language)
SQL ISO/IEC 9075 (Standard SQL)
These are commonly used formats for exporting and importing data between different systems. They are broadly supported and make it easy to transfer structured data.
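As an illustration of how these database-independent formats support data portability, the sketch below exports the contents of a relational table to CSV and JSON. It uses an in-memory SQLite table purely as a stand-in for any source database; the table and its contents are hypothetical.

```python
# Minimal sketch of exporting database content to the database-independent
# CSV and JSON formats; the SQLite table is a placeholder for any source.
import csv
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE permits (id INTEGER, municipality TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO permits VALUES (?, ?, ?)",
    [(1, "Utrecht", "granted"), (2, "Groningen", "pending")],
)

columns = ["id", "municipality", "status"]
rows = conn.execute("SELECT id, municipality, status FROM permits").fetchall()

# CSV export: readable by virtually any target system.
with open("permits.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)
    writer.writerows(rows)

# JSON export: preserves field names alongside the values.
with open("permits.json", "w") as f:
    json.dump([dict(zip(columns, row)) for row in rows], f, indent=2)
```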
Several respondents suggested establishing a list of commonly used open source databases and setting the export formats of those databases as the standard. Databases that were mentioned by many in this respect include: Postgres, MySQL, MariaDB, Mongo and Redis.
Interoperability standards for cloud computing are crucial for ensuring seamless and effective interaction between different cloud systems and services from various providers. In this way, interoperability promotes a more open, flexible and scalable cloud environment, where users have the freedom to choose and combine services from different providers based on their specific needs.
Portability and interoperability standards in cloud computing are closely connected, but they serve different purposes. Portability standards are used to facilitate the transfer of applications, data and services between different cloud environments, without significant changes or loss of functionality. Interoperability standards focus on safeguarding the compatibility between different cloud systems and services, so that they can work together seamlessly. Interoperability is essential for the creation of a coherent cloud environment with many functions, where different cloud services and components provided by the various providers can integrate and work together effectively.
Even though the two types of standards serve different purposes, they are complementary. Good portability facilitates interoperability, because systems that can be moved easily from one platform to another usually also work better with the systems on those platforms. There is, however, overlap between both types of standards. Overlapping standards are included in Section 4.3 (Portability standards).
The following interoperability standards, which are already on the ‘comply or explain’ list, were mentioned during the research:
REST (as part of Digikoppeling): Representational state transfer (REST) is an architectural style for the design of network applications. It is commonly used for building interactive applications that use web services. A RESTful system uses HTTP requests to obtain, create, modify and delete data, which means that it is suitable for use in internet applications. REST is relatively simple, lightweight and easy to understand and implement, making it a popular choice for developing APIs (application programming interfaces) in web applications.
REST API Design Rules: The REST API design rules standard provides a set of basic structure and naming rules that the government uses to provide REST APIs in a uniform and unambiguous way. This makes it easier for developers to create reliable applications using government APIs.
OpenAPI Specification (OAS): OAS gives application developers an unambiguous and readable description of a REST API, which allows them to use the API without having to know how it is implemented. OAS 3.0 facilitates the use and reuse of APIs and reduces dependency on providers.
The SCIM standard was also mentioned. It is on the list of open standards and its status is ‘recommended’. SCIM ensures that user identity information is in the right place across systems. Using this standard, data that should no longer be in systems – because, for example, the user no longer needs to be in that system – is deleted. Because this is a computerised process, relatively little effort is required to add or delete data. This standard is designed to reduce costs and complexity while building on existing protocols. The purpose of SCIM is to get users in, out and between cloud services quickly, cheaply and easily.
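The sketch below illustrates SCIM-based provisioning: creating a user at a cloud service over the standardised SCIM 2.0 interface; deprovisioning would later be a DELETE on the same resource. The service URL, bearer token and user attributes are hypothetical placeholders.

```python
# Minimal sketch of provisioning a user via SCIM 2.0; endpoint, token and
# attributes are hypothetical placeholders.
import requests

SCIM_BASE = "https://cloudservice.example.nl/scim/v2"   # hypothetical SCIM endpoint
HEADERS = {
    "Authorization": "Bearer DEMO-TOKEN",
    "Content-Type": "application/scim+json",
}

new_user = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "j.jansen@example.nl",
    "name": {"givenName": "Jan", "familyName": "Jansen"},
    "active": True,
}

# Create the user; removing the account later is a DELETE on /Users/{id}.
response = requests.post(f"{SCIM_BASE}/Users", json=new_user, headers=HEADERS, timeout=10)
response.raise_for_status()
print(response.json()["id"])
```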
The following interoperability standards, which are not yet on the list of open standards, were mentioned during the research:
FSC NLX: Federated Service Connectivity (FSC) NLX software makes it possible for organisations and government agencies to exchange data in a simple, secure and accessible way while at the same time being FSC compliant. Among other things, this helps government organisations to comply with new privacy legislation and give the population insight into their data. FSC NLX provides for the following:
setting up secure connections;
making services detectable and accessible;
monitoring and controlling connections within an organisation;
monitoring the use and availability of services centrally;
recording the use of services locally (logging).
FSC NLX is part of Common Ground.
Open Cloud Computing Interface (OCCI): The OCCI is a set of open specifications for cloud computing that the Open Grid Forum developed. It provides an API standard for managing all kinds of cloud infrastructure, including IaaS (infrastructure as a service).
Cloud Infrastructure Management Interface (CIMI): Developed by the Distributed Management Task Force (DMTF), CIMI focuses on managing the cloud infrastructure and aims to have a uniform interface for interaction with infrastructure-as-a-service (IaaS) models.
Cloud Data Management Interface (CDMI): CDMI is a standard that was specifically designed for data storage and management in the cloud. It enables users to create, delete, update and retrieve data and the associated metadata in the cloud.
GraphQL: GraphQL is a query language for APIs and a server-side runtime for handling queries. GraphQL is not tied to a specific database or storage system. Instead it is used to describe existing code and data in terms of an API. It provides a more efficient, powerful and flexible approach to API design than traditional REST APIs do. Clients can use GraphQL to specify exactly what data they need, which reduces over- or under-fetching of data. It also makes it possible for users to compose complex queries, whereby data from several sources can be merged into a single query. As a result, it is particularly useful in modern web and mobile applications, where efficiently managing data transfers and reducing the number of network queries is crucial for performance.
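The sketch below illustrates the point about fetching exactly the required data: the client sends a single GraphQL query over HTTP and receives only the requested fields. The endpoint and schema are hypothetical placeholders.

```python
# Minimal sketch of a GraphQL query over HTTP; the endpoint and schema fields
# are hypothetical placeholders.
import requests

query = """
query ($municipality: String!) {
  permits(municipality: $municipality) {
    id
    status
  }
}
"""

response = requests.post(
    "https://api.example.nl/graphql",   # hypothetical GraphQL endpoint
    json={"query": query, "variables": {"municipality": "Utrecht"}},
    timeout=10,
)
response.raise_for_status()
print(response.json()["data"]["permits"])
```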
Cloud services, and the hyperscalers in particular, offer all sorts of easily accessible proprietary services supported by proprietary standards. The various cloud providers have different proprietary services and standards, which complicates the required seamless and effective interaction between different cloud systems and services from various providers. The standards mentioned above promote interoperability but are not sufficient to cover it completely. Hyperscalers’ use of proprietary standards, and the intertwining of these standards, does not make it easy to implement open standards. This is where the major challenge lies. European legislation will have to compel the implementation of these standards in due course.
Although the research focused on standards for security and privacy, portability and interoperability related to cloud computing, several related standards and norms also came up for discussion. This concerns the following standards, norms and frameworks:
ISO/IEC 22123-1: Information technology – Cloud computing – Part 1: Vocabulary
Gives definitions for terms used in the context of the cloud, such as IaaS, PaaS and SaaS. It overlaps with NIST SP 500-292.
ISO/IEC 22123-2: Information technology — Cloud computing — Part 2: Concepts
The intention of this standard, entitled ‘Part 2: Concepts’, is to define and specify concepts used in cloud computing. It serves as an extension of the cloud computing vocabulary originally defined in ISO/IEC 22123-1. By elaborating on these concepts in greater detail, ISO/IEC 22123-2:2023 is laying a foundation that supports other documents and standards associated with cloud computing.
ISO/IEC 22123-3: Information technology — Cloud computing — Part 3: Reference architecture.
This standard, entitled ‘Part 3: Reference architecture’, specifies the reference architecture for cloud computing (CCRA). This document is important because it establishes guidelines and standards related to the structure and organisation of systems and services in the cloud computing environment. The reference architecture described in this document offers a structured and detailed blueprint for setting up and managing cloud-based systems, making it an essential resource for professionals in the field of cloud computing.
NIST SP 800-145: This is the National Institute of Standards and Technology’s definition of cloud computing, which provides a clear and concise framework for understanding cloud technology.
ETSI cloud standards: The European Telecommunications Standards Institute applies various standards and specifications for cloud services, and focuses on interoperability, security and SLAs.
ENISA Cloud Computing Risk Assessment: The cloud computing risk assessment, developed by ENISA (the European Union Agency for Cybersecurity), is a detailed document that assesses potential risks associated with the adoption of cloud computing services.
ISO/IEC 38500: ISO/IEC 38500 is an international standard that provides guidelines for effective corporate governance of information and communication technology (ICT).
FinOps-framework: The FinOps framework is a set of principles designed to help organisations manage and optimise their cloud costs more effectively.
ISO/IEC 19086: ISO/IEC 19086 is an international standard that provides guidelines and best practices for service level agreements (SLAs) for the cloud. These standards help to define, document and agree to service level objectives, measurements and responsibilities between cloud service providers and their customers.
ISO/IEC 19944: ISO/IEC 19944 (Parts 1 and 2) is an international standard for cloud computing and distributed platforms, with a special focus on establishing a framework for data flow and data categories in the cloud. This standard sets out guidelines for classifying data, including its origin, movement and use within cloud and distributed computing environments. It helps organisations to identify the various types of data processed in the cloud, such as user data, operational data and metadata. It also provides recommendations for managing and handling this data, taking into account issues such as privacy, security and compliance.
No existing gaps have been identified for these general standards.
This appendix explains the cloud. What does cloud policy focus on? The various cloud services and deployment variants are explained, along with the growing importance of cloud computing and a list of major cloud providers.
As a reference model for the definitions of cloud computing, we use The NIST Definition of Cloud Computing in this report. This reference distinguishes five essential characteristics of cloud services:
On-Demand Self-Service: Purchasers of cloud services can unilaterally obtain computing capabilities as required, such as server time and network storage. This can be done automatically, without human interaction with cloud service providers.
Broad Network Access: Using standard mechanisms, functionalities are available across networks to different types of clients, such as mobile phones, tablets, laptops and workstations.
Resource Pooling: The provider’s computing resources (such as storage, processing, memory and network bandwidth) can be shared to serve several customers using a multi-tenant model, where resources are dynamically allocated based on customer demand. It gives a sense of location independence in that the purchaser generally does not have any control or knowledge of the exact location of the resources provided, but may be able to specify the location at a higher level of abstraction, for instance the country, state or data centre.
Rapid Elasticity: Computing resources can be provided and released ‘elastically’, in some cases automatically, to be able to up- and downscale quickly. It often seems to purchasers as though the available computing resources are unlimited and can be obtained in any quantity and at any time.
Measured Service: This involves cloud systems measuring and optimising the use of computing resources automatically. This is done at a set level of abstraction, for instance, storage, processing, memory and network bandwidth. The use of computing resources can be monitored, managed and reported, which provides transparency for the provider as well as consumers of the service used.
NIST distinguishes between three service models for cloud services. Government organisations purchase these different types of service models. The three main types of cloud services are Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).
Infrastructure as a Service (IaaS): IaaS gives users access to essential infrastructure, such as physical machines, virtual machines, network, storage and other fundamentals, without users having to own or maintain the actual hardware. For the Dutch government, this could reduce the demand for large data centres or server farms, because these resources can be obtained on demand from the cloud.
Platform as a Service (PaaS): PaaS takes it a step further by not only providing the basic infrastructure but also a platform on which applications can be developed, run and managed. Examples of this are operating systems, databases, web servers, development tools, access management, identity management, portal functionalities and integration facilities. For government agencies looking to build unique applications for their services, PaaS can be a valuable tool because it streamlines the development process without having to worry about the underlying system management.
Software as a Service (SaaS): This is probably the best-known model; it involves users accessing software applications via the web. Here we have in mind, for instance, email services, CRM systems or collaboration tools, such as office applications (for instance Microsoft365), client management (CRM, for instance Salesforce), and software development (for instance GitHub). For the Dutch government, this means that the various departments and agencies can have access to the latest software without having to worry about installations, updates or compatibility issues.
Comment: in the forthcoming EUCS (as well as ISO 22123) the term ‘as a service’ will be replaced by ‘cloud capability types’, i.e. ‘infrastructure capability’, ‘platform capability’ and ‘application capability’. In this report, we use the terminology in use at the time of writing.
For the government, this means that the three models can help to deliver services more efficiently and to respond to changing technological needs while reducing overheads. By choosing the right mix of IaaS, PaaS and SaaS, the Dutch government can create a technological infrastructure for primary processes, one that is simultaneously flexible and robust and is in line with frameworks and standards.
During the survey, respondents mentioned that, in practice, there is actually no clear separation between IaaS and PaaS. The three hyperscalers (Google, Microsoft and AWS) and other cloud providers deliver a mix of these two services. Generally speaking, a range of additional services are provided through app stores in an IaaS environment, such as database access, AI capability and authentication and authorisation services.
NIST distinguishes between four types of cloud services deployment at a cloud provider:
Public: The software and data reside entirely on the cloud provider’s servers, and generic functionality (which is the same for all customers) is provided.
Community: The cloud facility is accessible to a limited group of customers who trust each other enough to share this facility.
Private: Work is done on a (virtual) private ICT infrastructure. In this cloud, the user has full control over data, security and the quality of service. Applications made available via the private cloud use shared infrastructure components that are used for one organisation only.
Hybrid: This type of cloud is a combination of the above types of cloud deployment.
ENISA’s Cloud Cybersecurity Market Analysis and other background documents also mention another type of deployment, similar to the one mentioned by respondents: the multi-cloud. Multi-cloud deployment, like the hybrid variant, combines several deployment variants, but from different providers.
Cloud computing provides numerous benefits for both individuals and organisations. Below we list a few of the most prominent advantages:
Cost savings: By using the cloud, customers can save on the costs of purchasing and maintaining hardware. They often only pay for what they actually use. The Netherlands Authority for Consumers & Markets’ market study into cloud services confirms this impression: large data centres clearly offer economies of scale and can therefore offer cheaper services than small data centres can.
Scalability and flexibility: One of the biggest advantages of cloud services is the ability to scale up easily and quickly as an organisation’s demand for digital applications grows, without the need for major investments in physical hardware. Hyperscalers in particular can support applications that require tremendous computing power, such as artificial intelligence (AI), with its best-known application, ChatGPT.
Accessibility and mobility: Data and applications in the cloud can be accessed from any location that has access to the internet. This facilitates teleworking and access while on the move. Moreover, it also makes it possible to collaborate better, such as when working together on a document.
Security and compliance: Although security in the cloud is a much-discussed topic, reputable cloud providers offer advanced security features that organisations may not be able to implement themselves because they lack the knowledge or other resources, and they can assist with complying with strict regulatory standards.
In many ways, the cloud computing market in the Netherlands is a reflection of the wider European and global market, but it also has its own unique characteristics. Here is an overview of cloud providers in the Netherlands:
The top three hyperscalers together have the lion’s share of cloud services. The following hyperscalers operate for the Dutch government:
Amazon Web Services (AWS)
Microsoft
Google Cloud
Besides these hyperscalers, the following companies are active in the Dutch cloud market:
IBM Cloud
Oracle Cloud
VMware
Red Hat
OVHcloud
Alongside these global players, there are a few Dutch companies that operate in the cloud:
KPN Cloud
TransIP
LeaseWeb
Interxion
It is worth mentioning that the major cloud providers operating internationally are virtually all American with branches in Europe; OVHcloud is the only European player. Now that government organisations are permitted to use the public cloud (subject to certain conditions), the market share of these hyperscalers in government is growing. The risk of limited spread across the market, combined with the non-European origin of the major suppliers, calls for regulation through (European) legislation and underlying norms and standards.
An important starting point for the research was the letter from State Secretary van Huffelen to the House of Representatives dated 29 August 2022, in which she announces a change to the government policy in place until then. This letter provides information on the government-wide 2022 cloud policy. This policy concerns the central government’s use of public cloud services, replacing the previous 2011 policy that focused on private cloud services. The State Secretary’s letter permits government organisations to use the public cloud.
Key points of the cloud policy as defined in the letter from the State Secretary:
Public services are permitted to use public cloud services, subject to certain conditions and exceptions. Parts of the government that are not part of the civil service are advised to follow this government policy.
Processing personal data in public cloud services requires an approved pre-scan data protection impact assessment. If it concerns a high-risk case, a comprehensive data protection impact assessment (DPIA) is required.
Each department is responsible for having insight into the risks of using public cloud applications.
‘Implementation guidelines on cloud risk assessment’ will be compiled before the end of 2022. This document has since been compiled and is an implementation framework as opposed to guidelines.
Exceptions: Using public cloud services for information classified as state secrets is not permitted. The Ministry of Defence does not fall within the scope of this policy.
Conditions:
Departments have to formulate their own cloud policy.
A relevant risk assessment is required.
An annual report on the use of public cloud services has to be submitted to the State Committee on Integrity in the Civil Service.
An ‘exit strategy’ has to be laid down in contracts with cloud providers.
The provision of cloud services must meet existing ICT conditions.
Cybersecurity is essential, particularly when it concerns data processing in other countries. For this reason, the government also applies the C2000 criteria for cloud usage, which excludes suppliers or services from countries with an active cyber programme that targets the interests of the Netherlands.
The decision-making process has to be public, in accordance with the Dutch Open Government Act (Wet open overheid).
The storage and processing of personal data has to be in line with the GDPR.
Extra protection is required for special personal data.
If it concerns the storage and processing of a key register, or a source of a key register, public cloud facilities cannot in principle be used.
The letter stresses the importance of a balanced approach, taking advantage of the benefits of public cloud services while managing the risks.
The Standardisation Forum formulated the research question. The Standardisation Forum advises the public sector about the use of open standards. In the process, the Forum applies various objectives and points of departure. These objectives and points of departure are the basis of the research:
Open standards: The Forum promotes the use of open standards. An open standard is a specification that is publicly available and whose use is not restricted by patents or licence rights.
Level playing field: Using open standards creates a level playing field for providers of ICT products and services. It encourages innovation and prevents government organisations from becoming dependent on one supplier.
Interoperability: One of the Forum’s key objectives is to safeguard interoperability. This means that different services from various cloud providers can communicate and exchange data with one another without problems.
Cloud computing needs a range of standards to promote interoperability, security, privacy and portability. The research differentiates between the various types of cloud standards:
Security and privacy standards: Security standards ensure that there is a secure environment, which unauthorised persons cannot access. These standards cover aspects such as data encryption, authentication, authorisation and audit logging. Privacy standards focus on protecting personal data stored or processed in the cloud, for example standards for data masking, anonymisation and pseudonymisation (an illustrative sketch follows this list).
Portability standards: These standards make it easier to move applications and data from one cloud environment to another. This includes, for instance, standards for containerisation.
Interoperability standards: These standards ensure that the various cloud services and components can communicate and exchange data with each other in a standardised way. They can help to avoid vendor lock-ins and support multi-cloud strategies.
Other standards: These are standards that are not covered by the classification outlined above but are nevertheless relevant and therefore deserve to be mentioned.
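As an illustration of the kind of technique the privacy standards above address, the following is a minimal pseudonymisation sketch in Python. It is illustrative only and not prescribed by any standard discussed in this report; the secret key, field names and values are assumptions.

```python
import hashlib
import hmac

# Secret key kept outside the data set (e.g. in a key vault).
# The value below is a placeholder for illustration only.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Unlike anonymisation, the mapping can be reproduced by anyone holding
    the key, so under the GDPR the result is still personal data.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical record with a direct identifier.
record = {"citizen_id": "123456789", "municipality": "Utrecht"}
record["citizen_id"] = pseudonymise(record["citizen_id"])
print(record)  # the identifier is replaced by a stable pseudonym
```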
In addition to standards, the research also identifies norms and technologies that serve as de facto standards:
Standards: These are technical specifications or other precise criteria used as rules or guidelines to safeguard consistency and interoperability. They may be drawn up by official standards organisations or by sector organisations, or may even become de facto standards through widespread use.
Norms: In the context of technology and IT, norms are often official documents that set out widely accepted best practices, methodologies, processes or specifications. Official standards organisations usually issue norms.
De facto technologies: These are not technical specifications but working technical solutions that are so widely used in the market that they can be construed as standards.
Section 4 lists existing technologies or standards being developed for each type of cloud standard. ‘Existing gaps’ are described where possible.
Cloud computing has transformed the way businesses, governments and individuals use and approach technology. This dynamic topic continues to evolve, based on new innovations, usage patterns and business models. This section provides an overview of global trends in the field of cloud computing, as well as a summary of cloud developments in Europe and the Netherlands.
Below is an overview of the most important global trends in cloud computing:
Hybrid and multi-cloud strategies: Companies and organisations are more and more inclined to opt for a hybrid cloud approach, which involves combining private as well as public cloud resources. At the same time, they are adopting multi-cloud strategies, using services from several cloud providers, to increase flexibility and reduce risk.
Serverless architectures: Serverless computing, often referred to as ‘function as a service’ (FaaS), enables developers to build and run applications without worrying about the underlying infrastructure and required services. Cloud providers offer this underlying infrastructure and the required services, which accelerates development and can reduce costs (an illustrative sketch follows this list).
AI and machine learning integration: Cloud providers are expanding their services with tools and platforms that integrate AI and machine learning. This makes it possible for organisations to perform powerful data analytics and add intelligence to their applications without having to make major investments in advance. All the hyperscalers are busy doing this, and there are great expectations in this regard, given the investments that these organisations are currently making.
Improved security measures: With cybersecurity concerns on the rise, cloud providers are investing in advanced security technologies, such as AI-driven security analyses, encryption and zero-trust security models.
Containers and orchestration: Development based on containers, such as Docker, and orchestration tools, such as Kubernetes, have grown in popularity because they help developers to build applications that can be easily scaled and moved across various cloud environments, which in turn enhances portability.
Sustainability: Based on growing concerns about climate change, businesses and consumers are more and more inclined to examine the environmental impact of technology. Cloud providers are responding to this by building more sustainable data centres and using green energy.
Data sovereignty and local regulations: With stricter data protection laws in a range of countries and regions, cloud providers are busy working on regional data centres and offering specific solutions to comply with regulations.
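To illustrate the ‘function as a service’ model mentioned in the list above, here is a minimal Python sketch of a serverless handler. The handler(event, context) signature and the event fields are assumptions modelled on common FaaS platforms, not a specific provider’s API; the platform provisions and scales the runtime around this function.

```python
import json

def handler(event, context):
    """Minimal FaaS-style function.

    The provider supplies the runtime, the scaling and the `event`/`context`
    objects; the developer only writes the business logic below.
    """
    # Hypothetical event payload: a name passed by the caller.
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Local usage example (outside a FaaS platform the context can simply be None):
if __name__ == "__main__":
    print(handler({"name": "E-Space"}, None))
```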
Global trends in cloud computing reflect a rapidly changing technological landscape and the needs of organisations and individuals. As cloud computing continues to evolve, the fundamental principles of flexibility, scalability and on-demand access will remain the driving forces behind this transformation.
As mentioned earlier, the cloud provides numerous benefits. It is essential that appropriate measures are taken to ensure that developments are controlled in line with the norms and values prevailing in European society. These measures should take the form of appropriate laws and regulations that oblige cloud providers to adopt open standards.
The EU views legislation and standardisation as strategic instruments to make cloud services healthier and more secure. The 2023 Rolling Plan lists the following actions for cloud computing:
“Action 1 - Identify needs for ICT standards and open source technologies to further improve the interoperability, data protection and portability of cloud services and continue or start respective development activities…
Action 2 - Promote the use of the ICT standards needed to further improve the interoperability, data protection and portability of cloud services as well as multi-cloud management.”
Below is a list of interesting European developments that affect the way data is stored, processed and shared at a European level, and developments that influence the design of cloud services:
Data Governance Act: The Data Governance Act came into force on 24 September 2023. The regulation creates a new European way of data governance based on increasing trust in data sharing. Its aim is to create a secure environment for sharing data across sectors and Member States for the benefit of society and the economy. This strategy has direct implications for cloud computing because it is intended to develop sector-specific, shared European data systems. For the Netherlands, this means that government systems have to be compatible and in line with these European initiatives.
Data Act: This European regulation requires cloud service providers to make it possible for users to switch to another provider easily, without losing data and functionality (i.e. portability). The regulation does not prescribe specific standards that providers must use for this purpose; however, it does define several functional requirements that they must meet. Besides the obligation to facilitate portability, cloud service providers also have to enable interoperability. That is to say, users must be able to choose to use two or more cloud service providers in parallel without encountering problems. Pursuant to the findings of its market study into cloud services, the Netherlands Authority for Consumers & Markets (ACM) has also insisted on the latter point. Even though the Data Act does not prescribe specific standards for portability and interoperability, it does offer the European Commission the opportunity to draft these standards in the future in the form of delegated legislation. The European Parliament and the European Council have since adopted the Data Act, which can now be published. It will then take another twenty months from the time of publication for the Data Act to come into force, in other words in the autumn of 2025. A European focus group will assess which frameworks and provisions need to be developed to support Member States in implementing the Data Act. The group will focus inter alia on open standards and data spaces.
GAIA-X: This initiative, driven mainly by Germany and France, aims, or aimed, at establishing a competitive, secure and reliable range of cloud services for Europe. The objective of GAIA-X is to safeguard European values and regulations with respect to data. GAIA-X seems to have shifted its objective from achieving a competitive range of cloud services to jointly developing a digital governance layer that enables organisations and government agencies to keep control of public cloud facilities, making it simpler for cloud facilities to interact and for users to migrate from one cloud platform to another. ‘Control’ in this context also means complying with all sorts of security standards. Various experts have been critical of the results achieved by GAIA-X thus far. Given the potential implications for interoperability and data sovereignty, the Dutch government will have to closely follow developments concerning GAIA-X.
Digital sovereignty: The EU has expressed its ambition to enhance the digital sovereignty of its Member States. This concerns Europe’s capacity to develop independent digital solutions, including cloud infrastructures. This may have consequences for where and how government information is stored.
Reinforcing the GDPR: The General Data Protection Regulation continues to develop with the inclusion of additional guidelines and interpretations. It is crucial for the Dutch government to adapt to these evolving standards, especially in the context of cloud services.
EU Cloud Code of Conduct: This code of conduct, approved by the European Data Protection Authority, provides guidance for cloud service providers on how to integrate the GDPR into their services. It ensures that there is a uniform interpretation of the GDPR in the cloud sector, which is relevant for the Dutch government when selecting cloud partners.
The European Cybersecurity Act: Implemented in 2019, this Act introduced an EU-wide framework for cybersecurity certification. It is a framework with verifiable criteria. Various organisations have been mandated to certify themselves, and there are various levels of certification. Alongside the Cybersecurity Act, the NIS2 directive, the successor of NIS1, is also relevant. Just like the European GDPR for privacy legislation, the European NIS2 will become mandatory legislation. The ‘Algemene verordening gegevensbescherming’ is the Dutch name for the GDPR. NIS2 also has a Dutch name, ‘Netwerk- en Informatiebeveiligingsrichtlijn’. It has to be implemented as Dutch legislation by the second half (September) of 2024 (21 months after adoption); the same applies to all 27 European Member States. NIS2 is intended to strengthen cyber resilience by raising security levels and enforcing the adoption of ‘basic measures’ to prevent cyberattacks and reduce their impact. If the Dutch government uses cloud services, it is important to ensure that these services comply with European cybersecurity standards and the Dutch implementation of NIS2.
European cloud rulebook: To protect European businesses and public organisations, which are increasingly dependent on cloud technologies, it is important that cloud and edge services offered in Europe comply fully with relevant general and sectoral laws, as well as with key European self-regulatory norms and standards relating to security, energy efficiency, data protection, interoperability and fair competition. In recent years, stakeholders from the industry in Europe have been working together to develop these kinds of self-regulating norms and standards. The forthcoming EU Cloud Rulebook will offer an extensive catalogue of these regulations and it will set out the mechanisms that can be used to demonstrate regulatory compliance.
Simpl-project: The Smart Middleware Platform (Simpl) makes it possible for EU stakeholders to pool resources to create more business value and enhance resource efficiency, reduce costs and avoid duplication. This middleware enables connectivity between isolated data centres and EU operators, allowing them to leverage underutilised infrastructure. It will also open up data sources to public institutions, SMEs, organisations and industry to improve services in the public interest. This middleware can be used to achieve the Commission’s ambition to create an open marketplace for EU resources, which in turn will lead to efficient reuse of the efforts of other EU organisations. In line with the European Green Deal, the Smart Middleware Platform will be based on energy-efficient software. The services also focus on optimising energy consumption in all sectors.
DOME marketplace: DOME is an ecosystem that unites all cloud and edge service stakeholders, including infrastructure and platform providers, service integrators, certification agencies and customers from every industry. The aim is to simplify current practices and offer a continuum of bundled services that go beyond national borders.
The developments and challenges in the field of cloud computing are significant. They are too big for a country like the Netherlands to tackle on its own. It is therefore essential for the Dutch government to stay informed and contribute to developments at a European level. By being proactive and pursuing a well-considered approach to cloud adoption, and acting jointly at a European level, the Netherlands can contribute to a secure, efficient and compliant cloud infrastructure for its citizens and institutions.
The cloud transition is in full swing across the world, with the Netherlands being no exception. The Dutch government realises that, when setting conditions for the use of the cloud, it needs to join forces with its European allies under the umbrella of the European Union.
The Netherlands provides the necessary expertise and input in the various European developments noted previously. In addition to this contribution in Europe, the Dutch government has developed various cloud initiatives. Below is an overview of key cloud developments in the Dutch government:
Use of the public cloud subject to certain conditions: Given the many advantages of the cloud and the need for computing power for AI, the Dutch government has permitted the use of the public cloud under strict conditions. The 2022 government-wide cloud policy, set out in the letter to Parliament from State Secretary van Huffelen, sets out the conditions and is a prelude to the controlled use of public cloud facilities.
Guidelines on the application of risk management in public cloud services: The cloud policy and the ‘Implementation framework for risk assessment of cloud use’ were adopted in 2022; together they set the frameworks for public and hybrid cloud usage. These guidelines are intended for specialists in project teams, clients (CIOs) and specific officials (CISOs and CPOs) who work on a programme in which public cloud usage plays a role. In addition, they give clients better insight into the management of risks in public cloud usage. The cloud policy and the implementation framework are obligatory for the civil service when it concerns material use of the public cloud.
Strategic vendor management (SVM): The Strategic Vendor Management group of the Dutch government has been responsible since 2014 for negotiating agreements on the procurement and contracting of cloud services and software from Microsoft, Google and Amazon. Its tasks include:
Advising government institutions on the procurement of software and cloud products and services.
Assessing the risks associated with cloud products and services for compliance with privacy legislation (GDPR).
Organising national and international networks of purchasers and contract managers of cloud products and services.
Entering into contracts on behalf of the Dutch government between the central government and hyperscalers to arrive at appropriate conditions, such as GDPR compliance and compliance with the government information security baseline.
Common Ground – Haven: Each municipality organises its IT infrastructure in a different way. For instance, one municipality may be running its operations locally, while another may be using the cloud more. Applications have to be adjusted in line with the infrastructure on which they operate. This is what makes it difficult for municipalities to develop applications together and to deploy them quickly across all municipalities. Haven is a standard for platform-independent cloud hosting. Municipalities can use Haven to host applications anywhere without having to adapt their IT infrastructure. This creates uniformity, lowers costs and reduces dependence on suppliers, among other things. Haven prescribes a specific configuration of Kubernetes that has to be implemented on existing technical infrastructure, for instance on a cloud or on-premise platform. The prescribed configuration ensures that every Haven environment of any cloud provider that has implemented the standard is functionally equal, irrespective of the underlying technical infrastructure. Think of it as a layer of abstraction that provides a common starting point. It offers several advantages: uniformity in the technical infrastructure, interchangeability of applications, vendor independence, platform independence and cost reduction. Haven has its own compliance facility.
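To illustrate the abstraction layer described above, the following minimal Python sketch uses the official Kubernetes client to query whatever cluster the local kubeconfig points to; the same code works unchanged against any conformant cluster, whichever provider hosts it. It is a sketch of the portability idea only, assuming a local kubeconfig is available, and it is not Haven’s own compliance facility.

```python
from kubernetes import client, config  # pip install kubernetes

def describe_cluster() -> None:
    """List the nodes of whatever cluster the local kubeconfig points to.

    Because the Kubernetes API is the same everywhere, this code is
    provider-independent; it is an illustrative sketch, not the official
    Haven compliance checker.
    """
    config.load_kube_config()  # assumes a local kubeconfig is present
    core = client.CoreV1Api()
    for node in core.list_node().items:
        info = node.status.node_info
        print(f"{node.metadata.name}: kubelet {info.kubelet_version}, OS {info.os_image}")

if __name__ == "__main__":
    describe_cluster()
```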
Baseline Microsoft 365: More and more government organisations are using cloud services. A well-known example of a cloud service is Microsoft Office 365, and its use is increasing within government too. Departmental confidential information at the level of BBN2 may only go to the cloud if more stringent requirements and measures have been met, such as encryption. For instance, compliance with the GDPR, the Civil Service Data Security – Special Information and various norms and standards, such as the government information security baseline, is obligatory. Because of the growing use of the Office 365 cloud service, the Dutch government and Microsoft are working together on a baseline for the Microsoft suite.
Finally, the Dutch Government Reference Architecture organisation has a decision tree for risk assessment of cloud services as part of the government information security baseline theme for cloud services. In addition, this organisation has a wiki site for cloud computing. For the time being, this wiki site contains older articles about the cloud, and the Dutch Government Reference Architecture organisation is still looking for an expert group to keep it up to date.
While cloud computing has many advantages, it also comes with several risks and challenges that the government and other organisations and individuals should take into account. Developments to keep an eye on include issues concerning data privacy, security and integration with existing systems, as well as the expanding position in the market and, with that, the power of hyperscalers. One of the options in this respect is to implement legislation, norms and standards.
The following challenges and risks came to the fore during the research:
Vendor lock-ins: The cloud market is exposed to the significant risk of vendor lock-ins. Once a cloud provider has been chosen, there are major barriers to switching to another provider. The risk of vendor lock-in appears to be higher than with traditional on-premise solutions, for the following reasons:
The government puts cloud services out to tender, and the contract is awarded to only one provider, so the focus is entirely on that one service. This kind of tendering process takes a great deal of time and effort. This is substantiated in the ACM’s market study into cloud services:
"Once a user has chosen a specific cloud service provider, the barrier to switch to another cloud provider is very high in many cases. The discussions that ACM conducted for this study show that little switching takes place between cloud services of different cloud providers Users of PaaS and SaaS services in particular may experience difficulties when switching. For users of IaaS services, that applies to a slightly lesser extent."
Cloud services, and the hyperscalers in particular, offer all sorts of easily accessible proprietary services. Other cloud providers have different proprietary services that are difficult to transfer from one cloud provider to another. It is mainly the proprietary services that make it difficult to switch. Microsoft’s Office 365 is an example: once it is put into use, it limits the options for switching to another cloud provider. This point is also emphasised in the market study into cloud services conducted by the ACM:
"The customer lock-in applies particularly when PaaS and SaaS services are purchased as part of an integrated service offer. . Switching is complex with ICT products and services, which in practice are often closely interconnected with the processes within the organization This is all the truer for integrated cloud services because in many cases new connections have to be made and it is necessary to switch to multiple services at the same time."
The costs for outbound data are much higher than for inbound data. In other words, it is cheap to place data with a cloud provider, whereas transferring data elsewhere is expensive. This may discourage organisations from switching to a different cloud provider, and it creates unpredictability about the final total cost of using cloud services and about any savings that switching provider might yield. The Data Act intends to counter this, precisely to ensure that switching from one provider to another is not hampered by the high costs of moving data.
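To make the asymmetry concrete, the following sketch works through a hypothetical data transfer. The per-gigabyte prices and data volume are assumptions for illustration only; actual tariffs differ per provider and region.

```python
# Hypothetical tariffs (EUR per GB); real prices vary per provider and region.
INGRESS_PER_GB = 0.00   # inbound traffic is often free or nearly free
EGRESS_PER_GB = 0.08    # outbound traffic is typically charged

def transfer_cost(gigabytes: float, price_per_gb: float) -> float:
    """Simple linear transfer-cost model, for illustration only."""
    return gigabytes * price_per_gb

dataset_gb = 50_000  # e.g. 50 TB of government data (assumed volume)
print(f"Moving data in:  EUR {transfer_cost(dataset_gb, INGRESS_PER_GB):,.2f}")
print(f"Moving data out: EUR {transfer_cost(dataset_gb, EGRESS_PER_GB):,.2f}")
# With these assumed prices, entering the provider costs nothing while
# leaving costs EUR 4,000.00, which illustrates the switching barrier.
```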
‘Winner takes all’ market: ‘Winner takes all’ refers to the economic principle whereby the best-performing platforms are in a position to control an entire market or a very large part of it. The cloud market has obvious features of a ‘winner-takes-all’ market. This is confirmed by the Cloud Services market study report based on research carried out in the United Kingdom:
“There are two leading providers of cloud infrastructure services in the UK: Amazon Web Services (AWS) and Microsoft, who had a combined market share of 70% to 80% in 2022. Google is their closest competitor with a share of 5% to 10%.”
There is no reason to assume that the situation in the Dutch market differs from the one described in this report. This impression of consolidation is corroborated in the ACM’s market study into cloud services. The mechanisms of this consolidation are described in greater detail in this report.
Unpredictable costs: While cloud services may initially save on costs, unexpected expenses may arise with increased usage, especially if organisations do not monitor their usage carefully. Research indicates that estimating cloud service pricing can be extremely challenging. The pricing structure is often very complex and therefore difficult to predict; according to one respondent, cloud costs can often only be determined empirically, after the fact. Alongside technical and organisational impediments, there are also financial barriers to switching. The price structures that many cloud providers apply are at the root of this: there is a charge for every action taken, gigabyte stored or second of computing power.
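The following sketch illustrates why such a price structure is hard to predict: the bill is the sum of several usage-based charges that are only known after the fact. All unit prices and usage figures are assumptions for illustration only.

```python
# Hypothetical unit prices; real cloud price lists contain many more dimensions.
PRICE_PER_MILLION_REQUESTS = 0.40   # EUR per million API calls
PRICE_PER_GB_MONTH_STORAGE = 0.02   # EUR per GB stored per month
PRICE_PER_VCPU_SECOND = 0.000012    # EUR per second of compute

def monthly_bill(requests: int, storage_gb: float, vcpu_seconds: float) -> float:
    """Sum of three usage-based charges; only known once usage is known."""
    return (
        requests / 1_000_000 * PRICE_PER_MILLION_REQUESTS
        + storage_gb * PRICE_PER_GB_MONTH_STORAGE
        + vcpu_seconds * PRICE_PER_VCPU_SECOND
    )

# The same application costs very different amounts in a quiet and a busy month:
print(f"Quiet month: EUR {monthly_bill(2_000_000, 500, 1_000_000):.2f}")
print(f"Busy month:  EUR {monthly_bill(60_000_000, 500, 40_000_000):.2f}")
```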
Security risks: The data belonging to an organisation is beyond its direct control, which can lead to concerns about data breaches, hacking attempts and other cybersecurity threats. Cloud services can potentially be configured more securely than existing on-premise services because cloud providers often take more sophisticated security measures. That said, configuring a cloud environment demands specific know-how and experience with the cloud platform in question, and this is where the concern of many respondents lies: such knowledge is thin on the ground, which introduces security risks.
Risks related to data privacy: Storing sensitive data in the cloud can trigger privacy concerns, especially if the cloud provider stores data in another jurisdiction with different privacy laws. Besides the storage of sensitive data, this involves access to the data from outside the EU, the use of telemetry data and so on. See also the Guidelines for risk management of public cloud services – Government portal (overheid-i.nl).
Risks concerning data ownership and access: SaaS services in particular face this kind of risk. This exposure occurs wherever a service is bought and the purchaser no longer has direct access to the data. This often involves services that used to be provided on the premises, such as municipal licensing systems or applying for municipal public services. These applications are moving from being on-premise to cloud services. Subsequently, the purchaser usually no longer has direct access to the data because it is no longer on the purchaser’s infrastructure, and additional arrangements need to be made for this.
Insufficient knowledge and expertise: Respondents stated that, as it stands, there is not enough knowledge and expertise within the government and that, because of this, the government is being overtaken by events. Setting up a cloud environment is often very complex and calls for knowledge and experience beyond what is needed for an on-premise system. This know-how and experience is scarce within the Dutch government. In addition, it requires knowledge of the environments of the various cloud providers; for instance, setting up Microsoft Azure requires different knowledge to that required for setting up Amazon Web Services. Research shows that government authorities generally accrue knowledge separately from one another, and that this could improve with collaboration.
The following sources were used as input for this report:
Guidelines on the application of risk management in public cloud services
Letter to Parliament concerning the 2022 government-wide cloud policy
Preparatory work in view of the procurement of an open source cloud-to-edge middleware platform (30 March 2022)
Study presenting assessments of codes of conduct on data porting and cloud switching
Information Security Service’s factsheet for data processing agreements
Guidelines for cloud computing
Implementation framework for assessing the risks of cloud usage
Market study into cloud services conducted by the Netherlands Authority for Consumers & Markets
Cloud governance whitepaper Microsoft (May 2021)
Cloud assessment framework J&V version 1.2 (July 2021)
Final report of the Association of Dutch Municipalities’ research into the demand for cloud support at municipalities (26 April 2023)
CSPCERT WG (Milestone 3) Recommendations for the implementation of the CSP certification scheme
CSA STAR programme 2022
The experts that were interviewed during the research are listed below. The selection criteria ensured that there was a representative sample of experts from various government and related organisations involved in the topic of standards for the cloud. Not all experts that the researchers approached responded or were willing to participate in the research. This is the list of experts who were actually interviewed.
Name | Position and organisation |
---|---|
Henrique Barnard | Strategic vendor manager for Microsoft, Google Cloud and AWS, central government |
Frank van Dam | Architecture e-Government ICTU |
Roderick Schaefer | Ministry of the Interior and Kingdom Relations, initiator and former adviser on standards at Haven, Association of Netherlands Municipalities (VNG) |
Edward van Gelderen | Scrum master Common Ground/Haven VNG (Association of Netherlands Municipalities) |
Peter Wiggers | Kubernetes engineer VNG (Association of Netherlands Municipalities) |
Mathijs Hoogland | Kubernetes engineer VNG (Association of Netherlands Municipalities) |
Michiel Steltman | Managing director of Digital Infrastructure Netherlands (DINL) |
Jacques Eding | Portfolio holder for the cloud, consultant CISO at the central government |
Inge Piek | Consultant ICT Standards Netherlands Standardisation Institute (NEN) |
Edwin Harmsma | Research consultant Cloud - TNO & Centre of Excellence for Data Sharing and Cloud |
Harro Kremer | Enterprise architect Ministry of Justice and Security |
Chris Eyzenga | Technical CISO Ministry of Justice and Security |
Ruben Faber | Strategic cybersecurity consultant National Cyber Security Centre (NCSC) |
Femke Nagelhoud | Project manager and senior enforcement official, ACM |
Christiaan Waters | Supervision staff member ACM |
Jacco Hakfoort | Senior supervision staff member ACM |
Pieter Bas Nederkoorn | Product manager for the Common Municipal Infrastructure at the Association of Netherlands Municipalities (VNG) |
Geeske Logtmeijer | Implementation consultant Common Municipal Infrastructure at the Association of Netherlands Municipalities (VNG) |
Bas Huisman | Technical consultant Sociodome |
Linda Strick | Director of the Cloud Security Alliance |
Ruud Kerssens | Lead security expert EU Cybersecurity Certification RDI |
Sander Booij | Enterprise architect IBM |
Artan van Hooijdonk | Principal customer success account manager, Microsoft |
Benjamin Tissink | Cloud Security Architect Microsoft |
Erwin van Essen | Customer Success Director, Microsoft |
Jelle Niemantsverdriet | National Security Officer Microsoft |
Linda Durand | National Security Officer Microsoft |
Through strategic vendor management (SVM) for the central government, AWS and Google were also approached for this study, but this did not result in interviews.