
Watch out for top five storage trends!

CIOL Bureau

Key points to ponder when making your next storage-related decision


Steadily moving away from being a domain within IT, data storage is today a critical business area, vital to the survival and growth of the enterprise. However, storage technology is ever evolving and presents enterprises with new options to choose from time and again. Enterprises often find it difficult to keep up with new technologies, let alone choose technologies that deliver business value over a long period.

To dispel some of the mystery surrounding storage technology, Dataquest spoke to a few key vendors in the storage business and asked them what, in their view, would be the leading trends driving enterprise storage initiatives in the near future.

The vendors we spoke to were (in alphabetical order): BakBone Software, EMC, HP, IBM, Legato, Network Appliance, Quantum, StorageTek, Sun and Veritas. The result is the following five key trends that enterprises should watch closely, and pursue, in order to build and run a highly efficient storage infrastructure.


Primary Storage

The increasing implementation of critical enterprise applications like ERP, CRM and data warehousing, combined with the need to consolidate data storage centrally for ease of management, has accelerated the growth of the networked storage market in India. Falling hardware component prices are beginning to make networked storage affordable even to small and medium enterprises.

As Indian businesses find ways to move from a predominantly DAS-based environment to a SAN- or NAS-based infrastructure, NAS will find increasing adoption in mid-sized firms, driven largely by storage consolidation.


One technology that can make SANs affordable enough for many medium enterprises is iSCSI, which will find greater traction in the coming three to four quarters. iSCSI is an IP-based alternative to Fibre Channel as the protocol a server uses to access a storage device. Using iSCSI, enterprises can create IP-based SANs that are cheaper than FC-SANs while remaining comparably scalable and flexible.
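
As a concrete illustration, here is a minimal sketch of discovering and logging in to an iSCSI target from a Linux server, assuming the open-iscsi command-line tools are installed; the portal address is hypothetical.

```python
# Minimal sketch: attach an iSCSI LUN over plain IP, assuming a Linux
# host with the open-iscsi tools. The portal address is hypothetical.
import subprocess

PORTAL = "192.168.1.50:3260"  # hypothetical IP-SAN portal (IP:port)

def discover_targets(portal: str) -> list[str]:
    """Ask the portal which iSCSI targets (IQNs) it exposes."""
    out = subprocess.run(
        ["iscsiadm", "-m", "discovery", "-t", "sendtargets", "-p", portal],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each line looks like: "192.168.1.50:3260,1 iqn.2004-01.com.example:store1"
    return [line.split()[1] for line in out.splitlines() if line.strip()]

def login(target: str, portal: str) -> None:
    """Log in; the LUN then appears to the OS as a local SCSI disk."""
    subprocess.run(
        ["iscsiadm", "-m", "node", "-T", target, "-p", portal, "--login"],
        check=True,
    )

if __name__ == "__main__":
    for target in discover_targets(PORTAL):
        login(target, PORTAL)
```

Once logged in, the remote volume behaves like a local disk over the existing Ethernet network, which is exactly what keeps IP-SANs cheap: no Fibre Channel HBAs or switches are required.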

While FC-SANs, with their clear performance advantage, are expected to hold their ground among large enterprises that run them for mission-critical applications, other applications with equally large data requirements will see active deployment of IP-SANs. In the coming three to five quarters, small and medium organizations with no legacy SANs are expected to adopt IP-SANs in increasing numbers, while large organizations will prefer to wait and watch as the technology matures and references become available.

Backed by companies like Microsoft and Cisco, iSCSI is available in products from vendors like NetApp, EMC and IBM. However, the lack of iSCSI support across all storage components, for example in the backup and recovery arena, may hold back wide-scale adoption for another 8-12 months.


Ask for: iSCSI support in disk arrays and NAS boxes

Look out for: Unified storage that bundles NAS and SAN features in a single box, and iFCP, which allows Fibre Channel SANs to be interconnected over TCP/IP networks across any distance, using standard Gigabit Ethernet switches and routers.

Backups


A competitive business environment means that performance demands on storage infrastructure are increasing by the day. For example, salespersons may demand to be able to update information and retrieve the latest figures late at night while on an outstation assignment.

As business uptime becomes increasingly dependent on IT uptime, backing up data in batch mode by shutting down primary systems during periods of low usage may simply no longer be possible. The growing volume of data has added its own pressure on backup windows: large organizations with terabytes of data are already making a business case for backup windows to shrink to almost nothing in most cases.

A clever solution like disk-to-disk backup is expected to find favor with enterprises looking for smaller backup and recovery windows. The technique uses cheaper ATA disk drives to add a tier to the normal backup routine: data is first backed up from the primary disk to a secondary disk, and a tape backup is then taken from the secondary disk within normal backup windows, while the primary disk is free to serve normal data requests.
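
A minimal sketch of this two-tier, disk-to-disk-to-tape flow, assuming the primary data, the secondary staging disk and the tape drive are all reachable from one host; all paths are hypothetical.

```python
# Minimal sketch of a disk-to-disk-to-tape backup, with hypothetical paths.
import shutil
import subprocess
from pathlib import Path

PRIMARY = Path("/data/primary")    # live application data
STAGING = Path("/backup/staging")  # cheap secondary (ATA) disk
TAPE_DEVICE = "/dev/st0"           # hypothetical tape device node

def stage_to_disk() -> None:
    """Tier 1: fast disk-to-disk copy, so the backup window stays small."""
    shutil.copytree(PRIMARY, STAGING / PRIMARY.name, dirs_exist_ok=True)

def stream_to_tape() -> None:
    """Tier 2: the slower tape write reads from staging, not from primary."""
    subprocess.run(
        ["tar", "-cf", TAPE_DEVICE, "-C", str(STAGING), PRIMARY.name],
        check=True,
    )

if __name__ == "__main__":
    stage_to_disk()   # within the (much smaller) backup window
    stream_to_tape()  # can run any time; primary keeps serving requests
```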


Implementing a disk-to-disk backup solution may mean incremental costs, but the upside is that a full backup needs to be taken to disk only once; the rest are incremental backups. This results in far lower tape usage, which brings the total cost close to that of a standard tape-based backup solution. Moreover, the performance gains from a disk drive's higher data transfer speeds will ensure the success of this solution with enterprises that have both large volumes of data and a need for continuous uptime, such as financial institutions and BPO organizations.
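
The full-once-then-incremental routine can be sketched in a few lines: after one full copy to the secondary disk, later runs copy only files changed since the previous run. The paths and the timestamp marker are hypothetical.

```python
# Minimal sketch of incremental backup to a secondary disk: the first run
# copies everything; later runs copy only files changed since the last run.
import shutil
from pathlib import Path

PRIMARY = Path("/data/primary")    # hypothetical live data
STAGING = Path("/backup/staging")  # hypothetical secondary disk
STAMP = STAGING / ".last_backup"   # records when the last run finished

def incremental_backup() -> None:
    STAGING.mkdir(parents=True, exist_ok=True)
    since = STAMP.stat().st_mtime if STAMP.exists() else 0.0  # 0.0 => full run
    for src in PRIMARY.rglob("*"):
        if src.is_file() and src.stat().st_mtime > since:
            dest = STAGING / src.relative_to(PRIMARY)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # only new or changed files hit the disk
    STAMP.touch()  # the next run copies only what changes from here on

if __name__ == "__main__":
    incremental_backup()
```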

In general, organizations with huge monolithic tape environments will prefer to wait and watch before embracing the solution, while new implementations are likely to adopt it from the start.

Ask for: Desktop and laptop backups


Look out for: Continuous backups, which will mean no backup windows at all

DR and BCP

An area of interest ever since the events of September 2001, disaster recovery (DR) and business continuity planning (BCP) have finally been garnering their fair share of enterprise IT investment over the last few quarters. Two new trends are driving the uptake. First, companies with offshore customers, particularly in software services and BPO, face pressure from those customers to implement a sound DR strategy.

Second, the CEOs of many large companies want to implement DR and BCP to increase shareholders' trust and the value of their enterprise. The trend is expected to accelerate further, given the increasing regulations and legal requirements for the protection of business data across many industries.

The range of DR solutions being implemented will grow ever wider: from the simplest, archiving tapes at a remote location, to complex environments involving cold and warm sites, which reduce recovery times dramatically although they cost a lot more. The latter will also find more favor as cheap, high-speed leased lines become more widely available.

DR will not be limited to business data alone; business applications will also become a key element of a sound DR plan. Consequently, as the type and size of infrastructure required to implement a DR strategy grows, many organizations that do not view IT as core to their survival will seriously consider outsourcing their DR requirements.

Interestingly, DR is one of the few aspects of storage that can be outsourced to a third-party provider, subject to factors like trust and quantifiable service levels. However, many large enterprises and financial organizations that view data protection as key to their survival will continue to build their own infrastructure and invest in their own DR solutions.

Ask for: Consultancy on the optimal RTO and RPO for your business (see the sketch below), and a comparison of in-house and outsourced DR solutions

Look out for: More regulations on data protection
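
For readers new to the two acronyms: RPO (recovery point objective) caps how much data the business can afford to lose, and RTO (recovery time objective) caps how long it can afford to be down. Here is a back-of-envelope sketch of how a consultant might estimate them; every figure is hypothetical.

```python
# Hedged back-of-envelope DR arithmetic; all figures are hypothetical.

replication_interval_min = 15    # remote copy refreshed every 15 minutes
data_to_restore_gb = 500         # volume to rebuild at the DR site
restore_speed_gb_per_hr = 100    # tape/leased-line restore throughput
failover_overhead_hr = 2.0       # time to bring up apps and redirect users

# Worst case, everything written since the last replication cycle is lost.
worst_case_rpo_min = replication_interval_min

# Time to copy the data back, plus the fixed failover overhead.
estimated_rto_hr = data_to_restore_gb / restore_speed_gb_per_hr + failover_overhead_hr

print(f"Worst-case RPO: {worst_case_rpo_min} minutes of lost data")
print(f"Estimated RTO: {estimated_rto_hr:.1f} hours to resume service")
```

Tightening either number, by replicating more often or keeping a warm site ready, is what drives DR costs up, which is why the trade-off deserves consultancy rather than guesswork.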

ILM

Information lifecycle management (ILM) is already being hailed as the biggest direction for storage in the near future. ILM builds on the premise that the latest sales data is far more critical to a business than last year's salary records, and makes a case for using different types of storage for different types of data.

It warrants using high-end storage only for the more critical and valuable data, while keeping the rest on cheaper storage. The strategy is accomplished by investing in multiple types of storage devices, from ultra-cheap to ultra-fast, plus software that automatically moves data from one type of storage to another based on its receding importance to the enterprise. ILM implementations are also often classified into seven tiers.

Implementation of ILM-based strategies began last year in the US and in many European countries.

In India too, large enterprises with huge data volumes are beginning to consider ILM, although wide-scale, full implementations are expected only beyond the next one to two years. More likely in the meantime is a focus on ILM in policies and processes driven by business requirements; for example, an organization may use automated tape archival to move all data older than 30 days off the primary disk, leaving it free for fresh data.
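
That 30-day policy is simple enough to sketch: move every file untouched for 30 days from primary disk to a cheaper archive tier. The paths are hypothetical, and a real ILM product would also maintain indexes and media catalogs.

```python
# Minimal sketch of an age-based ILM policy: demote data not modified in
# 30 days from primary disk to a cheaper tier. Paths are hypothetical.
import shutil
import time
from pathlib import Path

PRIMARY = Path("/data/primary")   # fast, expensive tier
ARCHIVE = Path("/archive/tier2")  # cheap disk or tape-backed tier
AGE_LIMIT_SECONDS = 30 * 24 * 3600

def tier_down() -> None:
    cutoff = time.time() - AGE_LIMIT_SECONDS
    for src in PRIMARY.rglob("*"):
        if src.is_file() and src.stat().st_mtime < cutoff:
            dest = ARCHIVE / src.relative_to(PRIMARY)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(src), str(dest))  # frees primary for fresh data

if __name__ == "__main__":
    tier_down()  # typically run from a nightly scheduler
```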

Enterprises may also look to align their storage decisions with the principles of ILM, making modular investments in supporting technology as they build out their storage infrastructure.

Ask for: Scalability and interoperability across all tiers, media management

Look out for: Advanced automation software, content management, intelligent indexing

Virtualization

Storage virtualization is the magic wand that can allocate the required space to an application's data at just the right time, from what appears to be a single standardized pool of storage but is in effect a collection of multiple, heterogeneous storage devices: different types of disk arrays, SANs and NAS boxes, along with HBAs and switches.

Virtualization can reduce the complexity of managing storage infrastructure and cut the associated costs. True, universal virtualization would let one create volumes from any space on any storage platform, regardless of the vendor brand in use, and would allow applications to consume storage capacity through both file-level and block-level input/output.
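
A toy sketch of the pooling idea follows: a virtualization layer hands out volumes from whichever device has free space, hiding vendor and device type from the application. The device names and sizes are made up.

```python
# Toy sketch of storage virtualization: one pool, many unlike devices.
from dataclasses import dataclass

@dataclass
class Device:
    name: str      # e.g. a vendor-X array or a vendor-Y NAS box (made up)
    free_gb: int   # unallocated capacity on that device

class StoragePool:
    """Presents heterogeneous devices as one standardized pool."""

    def __init__(self, devices: list[Device]) -> None:
        self.devices = devices

    def allocate(self, app: str, size_gb: int) -> str:
        # The application never learns which physical box it landed on.
        for dev in self.devices:
            if dev.free_gb >= size_gb:
                dev.free_gb -= size_gb
                return f"{size_gb} GB volume for {app} carved from {dev.name}"
        raise RuntimeError("pool exhausted: add capacity from any vendor")

pool = StoragePool([Device("vendorX-array-1", 200), Device("vendorY-nas-2", 500)])
print(pool.allocate("ERP", 300))  # transparently lands on vendorY-nas-2
```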

Although the term has been around for a few years now, the virtualization products available in the market are designed to work only on homogeneous platforms; that is, a solution from vendor X will work beautifully with X's own disk arrays and NAS boxes, but may refuse to work with a box from vendor Y or Z.

To combat this problem to some extent, many vendors have forged alliances among themselves to support each other's products in virtualization. However, true virtualization is still quite a distance away. The growing importance of SMI-S certification by SNIA, which will enforce a standard for exchanging data across devices and platforms, is a step in the right direction.

Ask for: Ease of use, cost-benefit analysis, interoperability with legacy infrastructure, SMI-S certification on all products

Look out for: True virtualization (8-18 months)
