Webinar Report "European Cybersecurity: Cloud Security" - European Champions Alliance

Webinar Report “European Cybersecurity: Cloud Security”


By Dominique Tessier and Karla Karathomas
The ECA Webinar of January 25, 2022, which was followed by about a hundred people, provided the occasion for a rich exchange, reflecting the quality and experience of our five speakers: Luc d’Urso, CEO of ATEMPO, Yann Lechelle, CEO of SCALEWAY, Ayman Khalil, COO and managing partner of RED ALERT LABS, Elmar Eperiesi-Beck, CEO of EPERI, and Marc Nader, CEO of EXEO.

What are the differences, regarding cybersecurity, between managing information systems on premises and moving them into the Cloud? As soon as your data infrastructure is connected to the Internet, you are subject to attacks. That is true for “on premises” management as well as for having your data hosted by a cloud provider, in a 100% virtual state, said Luc d’Urso. Still, there are differences: it is in the interest of the Cloud provider to offer a high level of protection against attacks, and this provider has means that many “in house” users don’t have.

Generally speaking, the Cloud provider is responsible for the security of the infrastructure they provide, but not for the customer’s data. The provider makes sure the infrastructure they provide is not easily accessed or compromised and that data stored on it is not captured. The customer remains responsible for deciding who in their organization can access which part of their information system and data, and for safeguarding the integrity of that data, whatever happens.

For instance, what about a crash of a server managed by the Cloud provider? Of course, the provider must make every effort to prevent such an event. But the customer must keep in mind that there is no way to guarantee 100% that no technical failure will ever happen. They need to define together with the provider what a plan B looks like (redundancy in a separate data centre, for instance). This is, according to Yann Lechelle, the shared responsibility model, a model which must be fine-tuned depending on what the customer has chosen: IaaS, PaaS, SaaS …

This shared responsibility model also applies to GDPR compliance. The customer is responsible for keeping only the personal data they need in order to operate, and for restricting access to it. On the other hand, if personal data the customer has kept in accordance with the regulations is disclosed due to negligence by the provider, then the provider will bear responsibility.

Opening up your infrastructure carries risk, but in today’s world businesses need flexible infrastructure, and that is what the Cloud provides. Data risk is the price to pay for being able to run a business with flexibility; it is a risk management equation. Market velocity and elasticity combine to determine the payoff of a given flexibility-risk trade-off. Shared responsibility again: each side’s duties should be clearly set out in the contract between customer and provider.

The attack surface also needs to be considered: if the provider is big (e.g., Microsoft), the attack surface increases. Since US hyperscalers are gigantic providers, they have an equally large attack surface, and it is a certainty that they will be attacked. They have strong defence systems as well, but those systems can themselves be subverted, since they include many components, among them many third-party components, which can be compromised (think of Microsoft Azure recently). Hardening should occur on both sides: the provider should harden their solution so it can also serve high-risk clients, and a high-risk client must make sure all their data is protected (and stored redundantly) in the safest way possible.

All this leads us to consider two issues. The first is the need for customers to adopt a sound system that establishes “who is who” and assigns specific access rights (IAM, for Identity and Access Management). An information system can cover many data domains: HR, commercial, R&D, production, finances… Should every employee have access to everything? In fact, this question is valid wherever the data are stored, on private servers as well as in a Cloud.
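The IAM idea described above can be sketched in a few lines: users are assigned roles, and roles are granted access to specific data domains, so nobody has access to everything by default. This is a minimal illustration with hypothetical names, not a substitute for a real IAM service.

```python
# Minimal role-based access sketch: roles grant access to data domains,
# users hold roles. All names here are hypothetical, for illustration only.

ROLE_PERMISSIONS = {
    "hr_manager": {"HR"},
    "researcher": {"R&D"},
    "cfo": {"finances", "commercial"},
}

USER_ROLES = {
    "alice": {"hr_manager"},
    "bob": {"researcher"},
}

def can_access(user: str, domain: str) -> bool:
    """True only if at least one of the user's roles grants the domain."""
    return any(domain in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

With this mapping, `can_access("alice", "HR")` is true while `can_access("bob", "finances")` is false, and an unknown user gets no access at all. The same deny-by-default principle applies whether the data sits on private servers or in a cloud.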

The second is protecting your data whatever happens. A good way to do that is to encrypt important data. Different methods can be used, but in all cases the central issues are: who owns and keeps the encryption keys, and where is encryption performed? In some models, the Cloud provider offers encryption as a service, which may sound easier for customers. But then, if this provider is subject to extraterritorial jurisdiction, customers’ data can in some cases be exfiltrated and decrypted. Elmar Eperiesi-Beck’s recommendation is that customers keep the keys and encrypt before sending the data to the cloud, which can be done by using an encryption gateway under the customer’s control.
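The principle behind that recommendation can be shown with a toy sketch: the key never leaves the customer, encryption happens locally, and only ciphertext is uploaded. The cipher below (a SHA-256 keystream in counter mode) is deliberately simplistic and is not production cryptography; a real gateway would use a vetted AEAD cipher such as AES-GCM from an audited library.

```python
# Toy "encrypt before it leaves the premises" sketch. NOT production crypto:
# real deployments should use an audited AEAD cipher (e.g. AES-GCM).
# The point being illustrated: the key stays with the customer.
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from key + nonce + counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt locally; only the returned blob is sent to the cloud."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Decrypt locally with the customer-held key."""
    nonce, ct = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))
```

Whatever the cloud provider does with the stored blob, it cannot be read without the key, which never left the customer's hands.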

Our panellists also answered a question on multi-cloud. As many organizations move their data into the cloud, they tend to consider putting their eggs in different baskets; that is what multi-cloud is about. However, things are different depending on whether you simply put some domains in one cloud and others in another (for example, keeping your HR data in cloud #1 and R&D data in cloud #2), or whether you really split the data across two clouds. This approach makes data administration more complex, but it also reduces the business impact of a data exfiltration. Whatever the case, the customer should make sure their IAM and asset management systems encompass all of their cloud infrastructures.
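The "eggs in different baskets" idea can be sketched as replicating each object to independent stores, so losing one provider does not lose the data. The dict-backed stores below stand in for real cloud SDK clients; the class and names are hypothetical.

```python
# Multi-cloud replication sketch: every object is written to all backends,
# and reads fall back to any surviving copy. The dicts stand in for real
# cloud storage clients; all names here are hypothetical.

class MultiCloudStore:
    def __init__(self, *backends):
        self.backends = backends  # e.g. one client per cloud provider

    def put(self, name: str, data: bytes) -> None:
        for backend in self.backends:        # write to every cloud
            backend[name] = data

    def get(self, name: str) -> bytes:
        for backend in self.backends:        # read from the first copy found
            if name in backend:
                return backend[name]
        raise KeyError(name)

cloud_a, cloud_b = {}, {}
store = MultiCloudStore(cloud_a, cloud_b)
store.put("hr/payroll.csv", b"record")
del cloud_a["hr/payroll.csv"]               # simulate losing one provider
assert store.get("hr/payroll.csv") == b"record"
```

Note the administration cost the panellists mentioned: a single logical `put` now touches every provider, and access control (IAM) must be kept consistent across all of them.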

Thanks to Ayman Khalil of Red Alert Labs, who is also an EU legislative expert for data privacy, we covered the certification issue. European authorities, through ENISA, are working on defining a cloud cybersecurity framework. The work is conducted with consideration for what has already been done at national level in some member states. The objective is to have a scheme defined by June 2022, then coming into law by June 2023. Once issued, the ENISA rules will be mandatory for everyone in Europe and will have layers, so that organizations can choose their compliance level, from basic to high (high can be difficult for a cloud provider to supply, so it may apply to highly confidential domains). Customers will be able to compare providers by the same criteria, and providers will be able to target the certification level they are ready to apply. ENISA has already issued a state-of-the-art document, which supplies the basic wording and conditions for many contracts and legal documents. This certification is also important because it establishes a common language between customers, suppliers and official bodies. Providers have long lists of products, each a complicated technological puzzle, and need to ensure certification for every product. Otherwise, large players profit more from strict and uniform certifications, as these create a barrier for smaller players, who need more nuanced certification standards to be able to make their products compliant.

The last word can be left to Marc Nader, who summarized his recommendations as follows:

  1. Be sure of the reliability of the physical environment you use
  2. Protect your network access for uploading data and interconnecting multi-cloud and hybrid infrastructures
  3. Resort to Identity & Access management
  4. Know which security services are made available by your cloud provider (FW, WAF, log management, etc.)
  5. Resort to your own Data Protection services (Encryption, key management, etc.)
  6. Check that your cloud provider complies with sufficient security standards
  7. Organize your back up (cloud to cloud for instance)
Andrea Vaugan