Cloud – New Opportunities mean New Rules

October 22, 2018

The idea behind the cloud has been around far longer than the explosion in its use seen over the past decade. The notion that computing could be made available and consumed on a shared basis, as a utility like electricity, gas or water, can be traced back to mainframe users pooling and sharing the IT resources available to them.

The emergence of the Internet, the technological evolution of networks, computing and storage, and the vision and entrepreneurship of a number of major Internet players unleashed the full potential of the cloud. It has disrupted how IT is accessed, consumed and charged for, and in the process it has also disrupted the role and profile of the data center. Cloud computing as it exists today has evolved to cater to many storage and computing requirements, and it offers a multitude of options to its users. It is the execution of an idea which, like all the most far-reaching ideas, is exceptional in its simplicity.

The cloud has eroded the traditional model of on-premise data centers as offsite cloud storage becomes the norm. Contrary to a number of early predictions, however, the 'captive' data center remains part of the data infrastructure in most companies: within a hybrid IT model, as the nerve center of a wider portfolio, as the repository of data the company does not want to outsource, or simply from force of habit.

Cloud is seen to offer businesses the advantages of flexibility – consume and pay based on usage – as well as meeting increasing usage requirements without the need for major capital expenditure. To capitalize on this key benefit, cloud providers are offering services customised to the needs of clients, including different deployment models – public (shared), private (separated), hybrid (a mix of public and private) and multi-cloud (a mix of cloud-accessed services).

The challenge for CIOs is to find the cloud model that best suits their business needs. Despite the maturity of the cloud, concerns remain around its deployment: security (of both the network and the provider data center), service levels, contention for shared services, and the stability of VMs. Several high-profile outages have highlighted the vulnerability of very dense, highly networked, variable-demand systems, although the vulnerabilities of hyperscale data centers are no different in kind from those of captive data centers.

Additionally, legislation is only belatedly catching up with the realization that data in the cloud can travel invisibly around the world to a provider data center well beyond the borders of the nation where the citizens who provided that data live. The division of the world into cloud 'zones' is one means of meeting data sovereignty requirements.

More locally, 'private' cloud has developed to overcome some of these hesitations around access and security. There are reports of companies facing higher charges than expected from cloud providers, but this suggests that the stage after technological maturity will be greater customer sophistication: customers knowing their way around the clouds they use.

For limited data requirements a pay-per-use model has gained traction. This may offer a CIO the flexibility to migrate between competing service providers, in turn representing a financial benefit to the customer. It also allows them to access the best technology available. Even so, many organisations bound by complex regulatory obligations and governance standards remain hesitant to place data or workloads in the public cloud for fear of outages, loss, theft or non-compliance.
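The financial case for pay-per-use can be made with simple back-of-envelope arithmetic. The sketch below compares usage-based cloud cost against fixed on-premise spend for a lightly used workload; all figures (hourly rate, capex, opex) are illustrative assumptions, not real provider pricing.

```python
# Hypothetical comparison of pay-per-use cloud cost versus fixed
# on-premise capital and operating expenditure. All figures are
# illustrative assumptions, not actual provider pricing.

def cloud_cost(hours_used: float, rate_per_hour: float) -> float:
    """Pay only for the hours actually consumed."""
    return hours_used * rate_per_hour

def on_prem_cost(capex: float, months: int, opex_per_month: float) -> float:
    """Fixed cost incurred regardless of actual usage."""
    return capex + months * opex_per_month

# A lightly used workload: 200 hours/month for a year at $0.50/hour.
yearly_cloud = cloud_cost(hours_used=200 * 12, rate_per_hour=0.50)
yearly_on_prem = on_prem_cost(capex=10_000, months=12, opex_per_month=300)
print(yearly_cloud, yearly_on_prem)  # prints: 1200.0 13600
```

For workloads that run near-continuously the comparison can invert, which is one reason the article's point about growing customer sophistication matters: the economics depend entirely on the usage profile.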

It also needs to be recognized that not all data is of equal importance or complexity; there has been a tendency to use the cloud to store data that must be kept but not accessed regularly, and which is repetitive rather than idiosyncratic in type.

The cloud's capability to handle complex and variable data sets and types at the necessary levels of latency will, in part, determine its role in the evolution of the Edge. Public cloud computing lends itself well to big-data processing of data derived from IoT devices, and many providers are establishing facilities to take that on as both an opportunity and a key customer step towards digitalization.

A hybrid cloud model, though, seems the most logical choice for many companies in the Indian market. It means that mission-critical data remains on site while any workload burst or spike in demand is handled by the scalability of the public cloud. Public cloud services share their infrastructure amongst numerous customers in a multi-tenant environment. Over time, public cloud computing has proven the reliability of logical isolation, increasing consumer confidence, and data encryption together with the addition of various identity and access tools has improved logical security.
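The hybrid pattern described above, where steady-state work stays on the captive site and only spikes spill over, is often called "cloud bursting". A minimal sketch of the dispatch logic, with a purely hypothetical capacity figure and `dispatch` helper for illustration:

```python
# Minimal sketch of hybrid "cloud bursting": keep steady-state load on
# the captive (private) site and spill only the overflow to public
# cloud. The capacity figure and dispatch() helper are hypothetical.

PRIVATE_CAPACITY = 100  # requests/sec the on-premise site can absorb

def dispatch(load: int) -> dict:
    """Split incoming load between private and public capacity."""
    private = min(load, PRIVATE_CAPACITY)
    public = max(0, load - PRIVATE_CAPACITY)  # burst only on overflow
    return {"private": private, "public": public}

print(dispatch(80))   # prints: {'private': 80, 'public': 0}
print(dispatch(250))  # prints: {'private': 100, 'public': 150}
```

The design choice this illustrates is that the public cloud is paid for only during spikes, while sensitive, mission-critical data never leaves the private site.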

India is one of the fastest growing markets in the Asia Pacific in terms of growth in cloud usage and investment. One of the APAC market leaders, Alibaba Cloud, has recently renewed its focus on India and increased its presence to capitalize on the significant business opportunities as digital transformation is expected to contribute US$ 154 billion to India’s GDP.

Over the last two years, the public cloud services market in India has seen phenomenal growth, reaching US$1.8 billion in 2017, up from US$1.3 billion in 2016. By 2020, this figure is expected to reach US$4.1 billion. Today, India is second only to China as the largest and fastest-growing cloud services market in the Asia Pacific.

With a robust GDP growth rate, rapidly rising internet penetration and the necessary technological capability to design, construct and operate the necessary infrastructure, India is poised for booming expansion in the provision and consumption of cloud services and architectures.

By Nick Parfitt, Senior Global Analyst and co-authored by Navin Andrade, General Manager India, DatacenterDynamics
