
The multi-cloud is here to stay – Here’s what you need to know

The development of the cloud over the last 15 years is one of the most significant convergences of computing and communications technologies in history.

It provides unprecedented agility and scalability for organisations, gives individuals immediate access to information and transactions, and has transformed the global economy much as smartphones and IoT devices have.

This is according to Alain Sanchez, Senior CISO Evangelist at Fortinet.

A recent IHS Markit survey sponsored by Fortinet, however, found that organisations are moving applications and DevOps services back and forth between on-premises networks and cloud environments as they work out where and how the cloud best serves them – an unusual turn in the current progression of cloud adoption under CISOs.

Multi-Cloud – here to stay

Of the 350 companies surveyed, 74% had moved an application into the public cloud, and then decided to move it back into their on-premises or private cloud infrastructure. “This doesn’t mean they reversed all of their cloud deployments, just that they are encountering cases for bi-directional movement,” said Sanchez.

Many of these deployments were, according to respondents, planned and temporary, arising from the need to set up a temporary infrastructure during an IT transition associated with a merger or acquisition.

Respondents also cited the need to manage costs, shifting regulations, the development of new applications, and changes in underlying technologies as reasons these deployments were necessary. But the top two responses – each selected by 52% of respondents – were performance and security.

Challenges in responsibility

While performance is likely to improve over time as cloud application development practices mature, security is a more vexing problem, as many companies don’t have a good handle on who is responsible for what.

The report shows that many companies incorrectly hold their cloud provider responsible for higher-layer threats (such as APTs) affecting vulnerable systems they have chosen to deploy, when in fact the organisation itself is responsible.

According to Sanchez, in order to address risk effectively, it’s important to distinguish between the organisation’s responsibilities and those of the cloud provider.
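The shared-responsibility split the survey respondents misunderstand can be made concrete with a small sketch. The mapping below is illustrative only – a simplified, hypothetical summary of who typically secures which layer under common service models, not any provider’s actual contract.

```python
# Illustrative sketch (not from the article): a simplified shared-responsibility
# matrix. Under IaaS, higher layers (OS patching, applications) fall to the
# customer - which is why provider blame for APT-style threats is misplaced.
# Layer and model names here are simplified assumptions, not a vendor's terms.

RESPONSIBILITY = {
    "IaaS": {
        "physical_infrastructure": "provider",
        "hypervisor": "provider",
        "guest_os_patching": "customer",
        "application_security": "customer",  # e.g. the APT-vulnerable systems above
        "data_classification": "customer",
    },
    "SaaS": {
        "physical_infrastructure": "provider",
        "hypervisor": "provider",
        "guest_os_patching": "provider",
        "application_security": "provider",
        "data_classification": "customer",  # data governance stays with the customer
    },
}

def who_secures(service_model: str, layer: str) -> str:
    """Return which party is responsible for securing a given layer."""
    return RESPONSIBILITY[service_model][layer]
```

For example, `who_secures("IaaS", "application_security")` returns `"customer"` – the point the report says many organisations get wrong.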

Challenges in technology

Another challenge is that security tools, functions, policies, and protocols don’t operate consistently across different public cloud platforms, private clouds, and physical infrastructures.

“Moving an application or service from one environment to the next may be straightforward, but many security solutions require a significant amount of IT resources to redeploy and validate a security solution, especially when workflows, applications, and data need to be inspected and secured as they flow between different environments,” said Sanchez.

What must be done

Resolving this issue starts with standardising on a single security vendor whose solutions run consistently across the broadest possible range of public cloud, private cloud, and physical environments, and which give you a holistic view of your security.

Next, these tools need to run natively in the various public cloud environments to maximise effectiveness, while seamlessly translating policies, functions, and protocols between environments using some form of cloud-objects abstraction layer. This yields the best results, as existing security operational models remain applicable across a diverse and dynamic environment.
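The abstraction-layer idea described above can be sketched in a few lines: one generic policy object is rendered into per-environment forms, so the same rule follows a workload between clouds. This is a hypothetical illustration – the field names and translation targets below are assumptions for the sketch, not any vendor’s actual API.

```python
# Hypothetical sketch of a cloud-objects abstraction layer: a single generic
# firewall rule is translated into environment-specific representations
# (an AWS-style security-group entry and an on-premises iptables command),
# so one security policy stays applicable across environments.

from dataclasses import dataclass

@dataclass
class FirewallRule:
    name: str
    protocol: str     # e.g. "tcp"
    port: int
    source_cidr: str
    action: str       # "allow" or "deny"

def to_aws_security_group(rule: FirewallRule) -> dict:
    """Render the generic rule as an AWS-style security-group ingress entry.
    (Security groups only express allow rules, so only those are translated.)"""
    if rule.action != "allow":
        raise ValueError("security groups cannot express deny rules")
    return {
        "IpProtocol": rule.protocol,
        "FromPort": rule.port,
        "ToPort": rule.port,
        "IpRanges": [{"CidrIp": rule.source_cidr}],
    }

def to_iptables(rule: FirewallRule) -> str:
    """Render the same rule as an on-premises iptables command string."""
    target = "ACCEPT" if rule.action == "allow" else "DROP"
    return (f"iptables -A INPUT -p {rule.protocol} --dport {rule.port} "
            f"-s {rule.source_cidr} -j {target}")

rule = FirewallRule("web-in", "tcp", 443, "0.0.0.0/0", "allow")
```

The point of the design is that the policy is authored once, against the generic `FirewallRule`, and only the final rendering step differs per environment – which is what lets the operational model stay the same as workloads move.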

“Our mission as a security technology provider is to make sure there is no compromise on security – we are vendor-agnostic and we aim to provide a permanently updated view and characterisation of the latest threats,” said Sanchez.
