Top five strategies to improve storage capacity

By Anders Lofgren | June 24, 2004

Effective disk space management is a major issue as capacities continue to grow. In the past, most organizations simply kept purchasing additional disk because it was relatively cheap and offered a quick fix. Effective capacity management was not seen as a critical area given all the other IT projects (Internet, ERP, etc.).

However, with the slowdown in the economy, IT budgets are tightening and managing capacity by purchasing more hardware is no longer feasible. Cost reduction, effective utilization of current resources and, in general, doing more with less is the order of the day. At a high level, there are several potential technologies users can implement to address the need to manage growing capacities. Here are some good options:

Networked storage (storage area network (SAN) and network attached storage (NAS)): Although direct attached storage (DAS) is still the leading storage implementation, networked storage (SAN and NAS) is clearly the growth area and the direction of enterprise storage environments. A SAN allows users to consolidate their storage into one storage network instead of islands of storage connected to individual servers throughout the organization.

This consolidation eases capacity planning and storage management, as storage can be managed as a whole rather than as separate systems. It can also improve other areas such as remote mirroring and backup/recovery operations. NAS solutions can be implemented for similar benefits, with a definite cost advantage in smaller-capacity environments.

The decision between SAN and NAS should be based on application and business needs. These technologies are converging into a single storage network serving both file-based and block-based storage. We recommend that all customers implement networked storage as a foundation for future storage management capabilities.

Storage resource management (SRM): Storage resource management helps users identify how much storage is being used, where it is located, how it is being used and who is using it. This allows organizations to understand their capacity utilization, which is typically poor, ranging from 30 percent to 60 percent. SRM tools can identify unused storage assets, and customers can then take steps to allocate this capacity instead of acquiring new storage.
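The core of that discovery task can be sketched in a few lines. This is a minimal illustration, not how any of the SRM products named below actually work: it walks a directory tree, totals bytes per owner, and flags files untouched for a configurable period as reclamation candidates. The function name and the 180-day default are assumptions for the example.

```python
import os
import time
from collections import defaultdict

def scan_usage(root, stale_days=180):
    """Walk a directory tree, totalling bytes per owner (UID) and
    flagging files not accessed in `stale_days` as reclamation candidates."""
    per_owner = defaultdict(int)
    stale = []
    cutoff = time.time() - stale_days * 86400
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or unreadable; skip it
            per_owner[st.st_uid] += st.st_size
            if st.st_atime < cutoff:
                stale.append((path, st.st_size))
    return dict(per_owner), stale
```

A real SRM product adds far more: database- and application-aware reporting, trending, quotas and policy actions across heterogeneous arrays, which is exactly why the tools below exist.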

For example, if a 20TB installation identified its utilization as 40 percent and then increased utilization to 60 percent, the net new usable capacity is 4TB. The organization therefore avoids purchasing approximately 4TB of raw capacity, which could cost anywhere from $100,000 to $300,000 depending on the type of storage acquired. This is a very simplistic example, but we have seen customers derive significant value from these tools.
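The arithmetic behind that example is simple enough to write down directly. The per-TB price range below is back-derived from the article's own $100,000–$300,000 figure for 4TB; it is not a quoted market price.

```python
def capacity_freed_tb(raw_tb, util_before_pct, util_after_pct):
    """TB of additional data the same hardware can absorb after
    raising utilization -- i.e. the purchase that is avoided."""
    return raw_tb * (util_after_pct - util_before_pct) / 100

freed = capacity_freed_tb(20, 40, 60)        # 4.0 TB, matching the example
# $100K-$300K for 4TB of raw capacity implies ~$25K-$75K per TB:
low, high = freed * 25_000, freed * 75_000   # $100,000 to $300,000 avoided
```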

In most cases, organizations will need multiple tools from multiple vendors to accomplish the task in heterogeneous environments. Some of these tools may not offer the same granular functionality as the device-specific proprietary tools from the hardware vendors. The current leading vendors delivering products include BMC, Computer Associates, EMC, Hewlett-Packard (HP), Precise WQuinn, Sun, TrelliSoft (now owned by IBM) and VERITAS.

Storage virtualization: Virtualization technology has been the big buzzword in the storage market for the last 12 to 18 months. However, there has been little demand for these products to date. As a matter of definition, virtualization abstracts logical volumes from physical storage devices and presents these logical volumes to applications.

This abstraction layer can go across multiple physical subsystems allowing administrators to create logical volumes across several devices. It creates one pool of storage instead of many physical islands.
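The pooling idea can be made concrete with a toy model. This sketch is an assumption-laden illustration of the concept, not any vendor's implementation: a logical volume is a list of (device, offset, length) extents, so a single volume can span several physical subsystems drawn from one pool.

```python
class StoragePool:
    """Toy virtualization layer: one logical pool over many physical devices."""

    def __init__(self):
        self.devices = {}   # device name -> capacity in GB
        self.free = {}      # device name -> next free offset in GB
        self.volumes = {}   # volume name -> list of (device, offset, length)

    def add_device(self, name, capacity_gb):
        self.devices[name] = capacity_gb
        self.free[name] = 0

    def create_volume(self, name, size_gb):
        """Allocate extents greedily across devices; a volume may span
        several physical subsystems (toy model: no rollback on failure)."""
        extents, needed = [], size_gb
        for dev, cap in self.devices.items():
            avail = cap - self.free[dev]
            if avail <= 0:
                continue
            take = min(avail, needed)
            extents.append((dev, self.free[dev], take))
            self.free[dev] += take
            needed -= take
            if needed == 0:
                break
        if needed > 0:
            raise ValueError("pool exhausted")
        self.volumes[name] = extents
        return extents
```

For instance, a pool built from two 100GB arrays can serve a 150GB logical volume that no single device could hold, which is precisely the out-of-space relief described below.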

This can significantly improve storage utilization, reduce out-of-space conditions and greatly ease production applications' interaction with storage. However, virtualization has issues: the market and technology are immature, there are no standards, and there is little or no integration with existing storage management tools.

Virtualization is not an end but rather a means to an end; it is an enabling technology. Giga recommends users evaluate virtualization as a tactical tool today and revisit the idea of virtualization as part of a strategic storage management platform in approximately 12 months. There are a number of virtualization tools on the market, but the most frequently mentioned are DataCore, HP, StorageApps and FalconStor.

Data life-cycle management: This term describes organizations' need to manage data from its creation to its deletion or archiving. Data importance and data access requirements change over the data's lifetime; storage should reflect these changes to provide the most cost-effective solution.
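A life-cycle policy of this kind boils down to mapping a data object's age to a storage tier. The sketch below is illustrative only: the tier names and age thresholds are assumptions for the example, not figures from the article or any HSM product.

```python
from datetime import datetime, timedelta

# Illustrative policy -- tier names and thresholds are assumptions.
TIERS = [
    (timedelta(days=30), "high-end disk"),   # hot: accessed recently
    (timedelta(days=365), "ATA/near-line"),  # warm: aging, still online
]

def pick_tier(last_access, now=None):
    """Map a file's last-access time to a storage tier, HSM-style."""
    now = now or datetime.now()
    age = now - last_access
    for threshold, tier in TIERS:
        if age <= threshold:
            return tier
    return "tape/archive"                    # cold: rarely touched
```

HSM products automate the migration and recall that such a policy implies; the vendors below ship exactly that kind of functionality.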

As part of a strategic storage plan, customers must consider ongoing long-term storage needs. The more immediate need for "hierarchical storage management (HSM)-like" functionality can be addressed by a number of vendors, including VERITAS (Storage Migrator), Legato (Xtender products) and StorageTek (Application Storage Manager).

With the emergence of ATA disk drive products, this will become an increasingly interesting area. Customers will be able to maintain most of the performance and availability of their current disk solutions, but for half the price. ATA disk won't replace high-end disk solutions, but customers will find innovative ways to use these new products.

Investigate ATA-based solutions: We recommend customers investigate ATA-based solutions as fast backup/restore devices fronting tape libraries, and for replication copies, archiving and fixed content. Relevant products include Network Appliance's NearStore and EMC's Centera (for fixed content). Expect other vendors to announce ATA products in the next six months.

In the short term, organizations should implement networked storage and SRM tools to maximize storage utilization and improve storage management. Virtualization technologies should be considered for tactical needs but not implemented as a strategic storage management platform.

That decision should be delayed for 12 months. Data life-cycle management is a long-term strategic initiative that most customers have not yet addressed. However, given current storage growth rates, customers should address this area in the next 12 to 18 months. Vendor solutions in this area are still relatively weak in terms of integration into an overall storage management infrastructure, but we expect this to change.
