Cloud computing

Updates

Gartner predicts 21.4% growth of the public cloud market, bringing it to a total of $186.4 billion by the end of the year. While software as a service (SaaS) currently attracts the largest investment, the analysis predicts much higher revenue growth for infrastructure as a service (IaaS) over the next three years. The already high concentration of the IaaS market is expected to increase, with the top 10 providers accounting for 70% of the market by 2021.

The Clarifying Lawful Overseas Use of Data (CLOUD) Act was signed into law on 22 March by President Donald Trump. The new legislation amends the 1986 Stored Communications Act to allow federal law enforcement to compel U.S.-based technology companies, via warrant or subpoena, to provide requested data regardless of where in the world the data is stored. The CLOUD Act allows authorities, when a warrant exists, to request any data on U.S. citizens stored on any server a company owns or operates, and it introduces expedited procedures for ‘executive agreements’ with foreign governments. While many companies expressed their support for the law in a joint letter on 6 February, civil society groups and human rights organisations warned on 12 March that the CLOUD Act ‘undermines privacy and important democratic safeguards’.

Apple is set to begin hosting Chinese iCloud accounts in China, raising fears about human rights as state authorities gain easier access to data stored in the cloud, according to Reuters. The article reports that ‘human rights activists say they fear the authorities could use that power to track down dissidents’. Apple issued a statement saying that the move is necessary to comply with new Chinese laws requiring that cloud services offered to Chinese citizens be operated by Chinese companies and that the data be stored in China, noting: ‘While we advocated against iCloud being subject to these laws, we were ultimately unsuccessful.’ Apple said that iCloud keys stored in China will not give access to data stored in other countries, but privacy lawyers say protection for Chinese customers will suffer.

The cloud environment of carmaker Tesla was exploited by an attacker to mine cryptocurrencies, the security firm RedLock reports in its study ‘Cloud Security Intelligence (CSI)’. An unsecured Kubernetes console - Kubernetes is an open source system used to operate application containers, virtualised software, and cloud-based services - exposed access credentials to Tesla’s Amazon Web Services (AWS) cloud environment, which allowed attackers to inject cryptocurrency mining scripts and to access sensitive data such as vehicle telemetry. The study suggests that the unauthorised use of computing power to mine cryptocurrency - known as cryptojacking - is becoming an increasing threat to cloud environments, such as those of Amazon, Microsoft, and Google.
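A common first step in auditing for the kind of credential exposure described in the Tesla incident is scanning console output or configuration text for strings that match the documented format of AWS access key IDs (20 characters beginning with ‘AKIA’). A minimal sketch - the sample string and helper name are illustrative, not drawn from the incident itself:

```python
import re

# AWS access key IDs are 20-character strings beginning with 'AKIA'
# (a documented IAM identifier convention).
AWS_KEY_ID_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_exposed_key_ids(text: str) -> list[str]:
    """Return any substrings of `text` that look like AWS access key IDs."""
    return AWS_KEY_ID_RE.findall(text)

# Hypothetical console dump containing a leaked credential
# (AKIAIOSFODNN7EXAMPLE is AWS's own documentation example key ID).
sample = "env: AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE region=us-east-1"
print(find_exposed_key_ids(sample))  # ['AKIAIOSFODNN7EXAMPLE']
```

Matching the key-ID pattern only flags candidates for rotation; real secret scanners combine such patterns with entropy checks to reduce false positives.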

Apple partnered with Guizhou-Cloud Big Data, a Chinese state-owned company, to build Apple’s first data-storage centre in China. The iCloud content of Apple ID users registered in China will be sent to and managed by Guizhou-Cloud Big Data starting in March. Apple’s new terms and conditions agreement for China reveals that all personal information and files of Chinese customers stored on iCloud will be shared with Guizhou-Cloud Big Data and could be further accessed and scrutinised by the Chinese authorities.

Alphabet-owned Google is planning to build three submarine cables to expand its cloud computing infrastructure. The first, the Curie cable, will connect Los Angeles to Chile; the second, the Havfrue cable, built in partnership with Facebook, will link the USA to Denmark and Ireland; the third, the HK-G cable, will run from Hong Kong to Guam. With these three connections, Google will have direct investment in 11 cable systems. Work on the three undersea cables should be completed in 2019. While the Havfrue and HK-G cables will be built by joint consortia, the Curie link will be Google’s solo endeavour. ‘With Curie, we become the first major non-telecom company to build a private intercontinental cable’, Google stated.

Cloud computing can be described as the shift from storing data on the hard disks of our computers to servers in the cloud (i.e. huge server farms). Cloud computing offers ubiquitous access to all our data and services from any device, anywhere in the world where there is an Internet connection. At the same time, the fact that our data are stored with a third party - often in pieces and copies scattered across several jurisdictions - raises privacy concerns. The security of a cloud provider is likely to be far stronger than that of our own computers, yet the impact of any breach of a cloud provider’s systems is greater, since each cloud holds vast amounts of information about many citizens and companies.

The first wave of cloud computing started with the use of online mail servers (Gmail, Yahoo!), social media applications (Facebook, Twitter) and online applications (Wikis, blogs, Google docs). Apart from everyday applications, cloud computing is extensively used for business software. More and more of our digital assets are moving from our hard disks to the cloud. The main players in cloud computing are Google, Microsoft, Apple, Amazon, and Facebook, who either already have or plan to develop big server farms.

From hard disks to cloud computing

In the early days of computers, there were powerful mainframe computers and ‘dumb’ workstations. The power was in the centre. After that, for a long time, with PCs and Windows applications, computer power moved to the periphery. Will cloud computing close the circle? Are we going to have a few big central computers/server farms and billions of dumb units in the form of notebooks, monitors, and mobile phones? The answer to this and other questions will need time. Currently, we can identify a few Internet governance issues which are very likely to emerge in parallel with the development of cloud computing.

With more services delivered online, modern society will increase its dependence on the Internet. When the Internet went down in the past, damage was limited to the inability to send e-mails or browse the web. In the era of cloud computing, we may not even be able to write texts or perform calculations. This higher dependence on the Internet will imply higher pressure on its robustness and reliability.

With more of our personal data stored on clouds, the question of privacy and data protection will become central. Will we have control over our text files, e-mails, and other data? Could cloud operators use this data without our permission? Who will have access to our data?

With a growing volume of information assets going digital, countries may become uncomfortable with having national information assets outside their national ‘borders’. They may try to create national or regional clouds or make sure that existing clouds are managed with some degree of international supervision. Nationalisation of clouds could be further accelerated by the fact that all main operators in this field are based in the United States. Some argue that the current ICANN-centred debate may be replaced by an Internet governance debate on the regulation of cloud computing.

With a diverse set of operators of cloud computing, the question of standards is becoming very important. The adoption of common standards will ensure a smooth transfer of data among different clouds (e.g. from Google to Apple). One possibility that is being discussed is the adoption of open standards by the main players in cloud computing.

There are a number of working groups on cloud computing, such as The Open Group Cloud Computing Work Group, which includes some of the industry’s leading cloud providers and end-user organisations; and the Cloud Computing Strategy Working Group by the European Telecommunications Standards Institute (ETSI).

The governance of cloud computing is likely to emerge through the interplay of various actors and bodies. For example, the EU is concerned with privacy and data protection. The Safe Harbour agreement, which was meant to solve the problem of different privacy regimes in the USA and the EU, was declared invalid by the European Court of Justice in October 2015. With more digital data crossing the Atlantic Ocean, the EU and the USA will have to address the question of how US companies, the main operators in cloud computing, protect privacy in accordance with EU regulation. This issue came into sharper focus after the Snowden revelations of mass surveillance.

Standards will most likely be developed through agreements among the main companies. Google has already started a strong push towards open standards by establishing the Data Liberation Front, aimed at ensuring a smooth transition of data between different clouds. These are the first building blocks that will address the question of the governance of cloud computing. Others are likely to emerge as solutions for concrete policy problems.

Actors

The ITU Telecommunication Standardization Sector (ITU-T) develops international standards (called recommendations) covering information and communications technologies. Standards are developed on a consensus-based approach, by study groups composed of representatives of ITU members (both member states and companies). These groups focus on a wide range of topics: operational issues, economic and policy issues, broadband networks, Internet protocol based networks, future networks and cloud computing, multimedia, security, the Internet of Things and smart cities, and performance and quality of service. The World Telecommunication Standardization Assembly (WTSA), held every four years, defines the next period of study for the ITU-T.

The OpenFog Consortium works on creating an open and interoperable framework for fog computing - an architecture that distributes resources and services along the continuum from cloud to devices. Several committees and working groups focus on technical work (building operational models and testbeds for fog computing), contributing to the development of standards within relevant standardisation organisations, promoting innovation, and educating both the industry and the market on the advantages of fog computing. In February 2017, the Consortium published the OpenFog Reference Architecture, detailing the eight pillars of an OpenFog architecture: security, scalability, openness, autonomy, programmability, RAS (reliability, availability, and serviceability), agility, and hierarchy.

More and more standards and guidelines developed by ISO cover issues related to data and information security, and cybersecurity. One example is the 27000 family of standards, which covers aspects related to information security management systems and is used by organisations to keep information assets (e.g. financial data, intellectual property, employees’ information) secure. Standards 27031 and 27035, for example, are specifically designed to help organisations effectively respond to, defuse, and recover from cyber-attacks. Cybersecurity is also tackled in the framework of standards on technologies such as the Internet of Things, smart community infrastructures, medical devices, localisation and tracking systems, and future networks.

The World Wide Web (WWW) was developed at CERN, in 1989, by Tim Berners-Lee. The aim was to allow for automatic information-sharing between universities and research institutes around the world. The first website was also created at CERN, dedicated to the WWW project itself. In 1992, the first readily accessible browser for the WWW was launched. In 1993, the WWW software was put in the public domain and made freely available, thus allowing the web to further develop. The HyperText Markup Language (HTML) and the HyperText Transfer Protocol (HTTP) were developed at CERN as well, based on a proposal from Berners-Lee.

Instruments

Standards

With the rapid adoption of cloud computing services, which allow the storing and accessing of data, applications, and services in and from cloud servers, there are increasing concerns over the security of such services. The distributed nature of cloud computing, the vast amount of data stored in the cloud, and the possibility of remotely accessing cloud resources make cloud computing more vulnerable to security threats and challenges than other storage modalities.

Providers of cloud computing services are continuously looking into solutions for enhancing the security of their services, and, therefore, increasing the confidence of their users. At the same time, technical organisations and standardisation bodies are exploring possibilities for developing standards and recommendations specifically addressing the issue of cloud security.

The ITU-T Recommendation X.1601 on a ‘Security framework for cloud computing’, adopted in October 2015, gives an overview of security threats and challenges related to cloud computing, and outlines a number of security capabilities that could be deployed against such threats and challenges. Some of the security threats and challenges described in the recommendation include: data loss and leakage, insecure service access, insider threats, unauthorised administrative access, loss of trust, loss of confidentiality, service unavailability, loss of software integrity, and jurisdictional conflict. According to the recommendation, such threats and challenges could be tackled with the implementation of security capabilities such as: trust models for identity and access management systems that contribute to the confidentiality, integrity, and availability of services and resources; interface security, ensured through mechanisms such as unilateral/mutual authentication, end-to-end encryption, and digital signatures; network security; data isolation and protection; incident management and disaster recovery; and interoperability, portability, and reversibility.
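One of the interface-security mechanisms the recommendation names - authenticating requests before a cloud service acts on them - can be illustrated with a message authentication code. A minimal sketch using Python’s standard-library `hmac` module; the shared secret and the request payload are hypothetical, and X.1601 itself does not prescribe particular algorithms:

```python
import hashlib
import hmac

# Hypothetical shared secret between a cloud client and a service endpoint.
SECRET = b"example-shared-secret"

def sign(payload: bytes) -> str:
    """Produce a hex MAC that the service verifies before acting on `payload`."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(payload), tag)

request = b'{"action": "read", "object": "telemetry"}'
tag = sign(request)
assert verify(request, tag)                      # authentic request accepted
assert not verify(b'{"action": "delete"}', tag)  # tampered request rejected
```

Production systems typically layer this kind of request signing on top of TLS and per-user credentials rather than a single shared secret.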

As cloud computing technologies and services continue to evolve and are increasingly used as an alternative to the local storage of data and applications, security challenges will also continue to grow. In this context, there is an intensification of efforts aimed at developing and implementing standards, recommendations, and solutions addressing the increasing security risks and challenges. As an example, the ITU-T Study Group 17 continues its work in areas such as: requirements for software as a service application environments, operational security for cloud computing, and cloud service customer data security.

Big data refers to masses of data so large that they require non-traditional data processing applications. Due to their size and complexity, big data sets are challenging to analyse, store, transfer, and visualise; efficient analysis within required timeframes is also a major challenge. The need for an international standard on big data has been linked to the global adoption of big data solutions.

In November 2015, ITU members approved the first ITU standard on big data. Recommendation Y.3600 on 'Big data – Cloud computing based requirements and capabilities' describes the meaning of big data and the characteristics of the big data ecosystem from a standardisation perspective, and provides requirements, capabilities, and use cases of cloud computing-based big data for large data sets which cannot be rapidly transferred and analysed using traditional technologies. The standard outlines how cloud computing systems can be leveraged to provide big data services, thereby assisting the industry in managing large data sets.

Big data standardisation activities within the ITU fall under the responsibility of Study Group 13 – responsible for future networks, cloud computing, and network aspects of mobile communications – within the ITU’s Telecommunication Standardization Sector.

Before the adoption of the new international standard, global standardisation was seen as missing a key ingredient, and this gap was viewed as a challenge to the global adoption of big data solutions in a wider range of scenarios. Although analysts believe that market forces push vendors to establish interoperability on their own before official standards are agreed upon, the newly adopted standard aims 'to build cohesion in the terminology used to describe cloud-based big data, and to offer a common basis for the development of big data services and supporting technical standards'.

Big data can make important contributions to development, and can help with relief efforts in cases of natural disasters or disease outbreaks. Researchers believe that big data analysis helps improve decision-making in areas such as healthcare, crime, security, and economic productivity.

Resources

Articles

The report outlines data security threats and concerns in emerging cloud, big data, and Internet of Things technologies. Based on the results of a global survey of over 1100 senior security executives, the report identifies the following as the main data security concerns: security breaches/attacks, increased vulnerability from shared infrastructure, lack of control over the location of data, privacy violations involving data originating in multiple countries, and the protection of sensitive data generated by the IoT.

Publications

The latest edition of the glossary, compiled by DiploFoundation, contains explanations of over 130 acronyms, initialisms, and abbreviations used in IG parlance. In addition to the complete term, most entries include a concise explanation and a link for further information.

The book, now in its sixth edition, provides a comprehensive overview of the main issues and actors in the field of Internet governance and digital policy through a practical framework for analysis, discussion, and resolution of significant issues. It has been translated into many languages.

Papers

The paper looks at how the physical geography of cloud computing has affected the system of Mutual Legal Assistance Treaties (MLAT), and explains why the MLAT system, which is territorially-based, does not work in the context of cloud-based services.

The paper looks at approaches taken by governments with regard to cloud computing, and outlines several roles governments assume in this regard: users, regulators, coordinators, promoters, researchers, and service providers. It also contains several recommendations for policymakers to take into account when developing approaches to cloud computing.

The paper provides an overview of cloud computing as a technological innovation and innovation-enabling technology, and analyses legal and regulatory systems adopted in response to the evolution of cloud technology.

Reports

The study, based on a survey of more than 1700 decision-makers across small, midsize and large organisations around the globe, explores customer opinion and behaviour on issues such as cloud adoption, business priorities, and budget allocations for cloud infrastructures and applications.

The report, based on a survey of 1200 IT decision makers, looks at trends in the adoption of cloud computing within enterprises, and it explores issues related to cloud security (cloud security technologies, encryption, data loss prevention, etc).

The report offers an overview of the so-called ‘hybrid IT reality’, characterised by the migration of some infrastructure to the cloud, while critical services continue to be maintained on-site. It also explores the barriers to greater adoption of cloud technology, as well as the challenges of migrating to the cloud and managing the performance of hybrid IT environments.

The report looks at how the use of encryption has evolved over the past years, and presents information on encryption trends and challenges, as well as encryption adoption rates in different countries. The report also looks into the use of encryption by companies that store sensitive or confidential information in the cloud.

The index ranks 14 Asian countries on the basis of 10 parameters that measure how prepared they are to adopt and roll out cloud computing: international connectivity; broadband quality; power grid, green policy, and sustainability; data centre risk; cybersecurity; privacy; government regulatory environment and usage; intellectual property (IP) protection; business sophistication; and freedom of information.

GIP event reports

Mr Andy Bates, Executive Director for the United Kingdom, Europe, the Middle East, and Africa at the Global Cyber Alliance, introduced the Global Cyber Alliance and noted that cybercrime has overtaken traditional crime in terms of economic value. Despite the increasing economic risk of cybercrime, he argued that ‘cybercrime is just crime’, pointing out that it is crime adapting to modern tools. In his opinion, the responses should not differ fundamentally from the measures taken to address other forms of crime. He highlighted that cybercrime is usually serial in nature, with many criminals potentially using the same vulnerability and being repeat offenders. He also discussed the human psychological aspect of phishing and spoofed emails, as well as structural issues with the Internet.

He presented DMARC (Domain-based Message Authentication, Reporting and Conformance), which enables individuals and companies to publish policies for their domains that receiving mail servers then use to verify email trustworthiness. In addition, he presented the Internet Immune System, a blacklist given to top-level Internet service providers (ISPs) to track pages which contain malware. He argued that ISPs should work towards cleaning up the Internet for individuals.
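DMARC policies are published as DNS TXT records at `_dmarc.<domain>`, using a simple tag=value syntax defined in RFC 7489 (`v` for version, `p` for policy, `rua` for aggregate-report address, and so on). A minimal sketch of reading that syntax - the record contents below are hypothetical:

```python
def parse_dmarc(record: str) -> dict:
    """Parse a DMARC TXT record: semicolon-separated tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# Hypothetical record: reject failing mail, send aggregate reports.
record = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.org; pct=100"
policy = parse_dmarc(record)
print(policy["p"])  # reject
```

A receiving mail server would fetch this record via DNS, check the message against SPF and DKIM, and apply the `p` policy (none, quarantine, or reject) to messages that fail both checks.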

Lastly, Bates outlined future scenarios, focussing mostly on the importance of information sharing across the private and public sectors, together with measures to prevent duplication of effort. In addition, he mentioned how reporting on cybercrime could be centralised. As a concluding remark, he pointed out that individuals need to use common sense and intelligence when addressing cybercrime.

Dr Gustav Lindstrom, Head of the Emerging Security Challenges Programme at the Geneva Centre for Security Policy (GCSP), gave a presentation focussed on issues and trends for future consideration in the field of cybersecurity. First, he stressed that raising awareness needs to be a constant process: due to its constantly changing nature, cybercrime should be seen as an emerging threat.

Lindstrom’s second point focussed on evolving technologies and services which remain beneficial but also pose security challenges. He discussed developments such as cloud computing, as the cloud is an attractive target for attacks and can be used to hide malware. Big data poses security threats - for example, through the injection of false data - in addition to privacy issues. 3D printing can be used to circumvent existing controls while providing potentially dangerous tools, and such circumvention is also a risk posed by distributed ledger technologies. Finally, artificial intelligence and machine learning, despite their ground-breaking advantages, run the risk of being misused and compromised.

The Internet of Things (IoT) can provide benefits, but it also opens the door to many new potential threats. Lindstrom pointed out how the shift in states’ cyber defence and offence poses a challenge. He argued that an increasing number of countries have developed capabilities to move from defence to offence, with roughly 30 countries having dual capabilities, though this number is hazy, as is the boundary between defence and offence. As such, Lindstrom suggested, offensive cyber operations will likely increase, and cyber weapons might be updated at a fast pace, especially in terms of delivery mechanisms. As a final point, while there are differences in state capabilities, all countries will seek to utilise zero-day vulnerabilities to their advantage. He concluded his presentation by pointing out the increasing role of the private sector in the field, due not only to financial aspects but also to the proliferation of public-private partnerships.

Other resources

The guide provides information to SMEs and microenterprises on issues related to: the benefits of cloud computing, the building blocks of the technology, best practice guidelines regarding the adoption of cloud computing, tips for choosing a cloud service provider, and the European legal framework in the field.

The document is intended to serve as a self-regulatory harmonising tool which offers a structured way to communicate the level of personal data protection offered by a cloud service provider to its customers. It is based on mandatory EU personal data protection legal requirements.

The document is intended to assist companies in analysing cloud service agreements offered by cloud service providers. It looks into aspects related to: the relationship between the customer and the provider, the acceptable use policies, and the service level agreements.

The document is intended to provide guidance to companies in evaluating and comparing security offerings from cloud service providers. It looks into issues such as: security and privacy challenges in cloud computing; threats, technology risks, and safeguards for cloud computing; and methods for assessing the security capabilities of cloud providers.

The guide is intended to assist companies that are looking into adopting cloud computing, by providing them with detailed information on the characteristics and benefits of cloud computing, as well as with recommendations concerning the implementation and deployment of cloud computing technologies.
