Force Majeure and Storage

CIOL Bureau

KOLLAM, KERALA: Vivekanand Venugopal, VP and GM, Hitachi Data Systems India Pvt Ltd, helps us drill into the minefield of data technologies and what could really be a seismic force among factors like the relevance of big iron, the virtual zeitgeist, and analytics hotspots.

What are the trends to watch for with real implications for your industry as you see them?

The realizations triggered by economic uncertainty have started hitting customers around inefficiencies. At the same time, data is proliferating, and customers' ability to derive sense out of it will matter. Growth is on the rise, headcounts have dwindled in comparative terms, budgets are flat, and time has become a scarce resource. So you need a framework to optimize all of that. We have been in that space since 2007, with the ability to provide value to businesses. As per some trends we have been tracking, global economic uncertainty will require IT professionals to achieve better returns from their existing assets rather than buying new ones. There will be a greater focus on storage efficiency technologies such as storage virtualization, dynamic or thin provisioning, dynamic tiering, and archiving.

What is the big deal about big data we are hearing so much about?

The big hype for 2012 will continue to be around “Big Data”. The explosion of unstructured data and mobile applications will generate a huge opportunity for the creation of business value, competitive advantage, and decision support if this data can be managed and accessed efficiently. The massive size of Big Data sets will make it impractical to replicate, back up, and mine through traditional means. Big Data will be more about the information that can be derived from the intersections of many data sets or objects. In 2012, there will be greater adoption of content platforms in preparation for Big Data analytics.

Another big trend is the sudden and huge interest in analytics. What do you make of that?

Storage growth is so high that deriving information from it efficiently and effectively is a challenge. That is why analytics is proliferating. Hitachi's approach is that you need a framework to protect and derive information from the vast pool of data, which is possible only when a strong foundation has been clearly laid down.

So where does storage come into this frame? On the upstream or the downstream side?

You will need it both ways. The competitive advantage comes from making sense of data, and that has to be done from various angles: tagging metadata, proper frameworks, and good foundations of storage.

Another big debate is around the obsolescence of mainframes. Would they be written off or not, and how would that affect companies like yours in the industry dynamics?

Some things are expensive to buy but cheaper to own. You need to understand what your current baseline is, and look at an architecture that can reduce or address that challenge in the right way. Technologies may come and go, but if architectures are prudently chosen, not just on pricing mindsets, then companies can survive and lead the market. Obsolescence, then, is not a force to worry about.

With the advent of the virtual world, how can storage undergo the transition from its physical lineage, if any?

It depends on what level one is present at. If your business can accommodate a minimal amount of downtime, then the chances of virtualization are high. Hitachi's architecture has not changed. It's like the car industry: so many changes in design happen at such a sharp pace, but the car is still about four wheels, not five or three.

As far as virtualization goes, server and desktop virtualization will increase the need for enterprises to scale up storage systems, non-disruptively as physical server demands increase. Modular storage systems will need to be replaced by enterprise storage to service the tier 1 demands of virtual servers. Scale-out storage architectures will not be able to meet the scale-up demands of server and desktop virtualization.

What do typical Cloud-era factors like security or outage incidents translate into for storage?

You need to look at security holistically. The access, protection and management of data are important.

What, then, is your biggest challenge among all the big changes happening above the surface?

Customers have started investing in products. They should start investing in architectures. One needs to apply storage economics, which will tell you what is going to give diminishing returns. Price does not equate to cost here. Our architectures, as I said, have not changed, like the wheels of a car. But they have evolved at the pace that new changes mandate.

The well-known industry major in storage has also reinvented itself beyond storage as it extends actively into virtualization and other areas. How is your portfolio changing as it orients itself to competition and new trends?

Hitachi's approach is about the creation of technology. We design, manufacture, support, and retire all storage infrastructures, i.e., across the complete lifecycle. EMC, IBM, etc. are assemblers of technologies. A creator and an assembler occupy two different spaces; their big-ticket acquisitions and pursuit of virtualization technologies show this direction. We do not bring any significant level of disruption, unlike them, because we do not rip and replace. Our business model has shown a different level of business shift: our revenue pie shows 50 per cent coming from software and services and not so much from hardware.