
Separate fact from fiction for software-defined success

Vivekanand Venugopal


COMPUTING is considered a science. But that hasn’t prevented the IT industry from creating its own myths. The emerging field of software-defined infrastructure is no exception, and one of the biggest and most erroneous of these myths is the idea that “one size fits all.”

The fact is that it doesn’t. And the reason is that, while there are some similarities between businesses, they are dwarfed by the almost infinite number of differences.

Take a business like an international bank, for example, with hundreds of branches and thousands of employees in multiple countries, and years or even decades of service. Such organizations also inevitably feature a diverse mix of technologies, ranging from legacy applications to the latest virtualized and software-defined solutions that leverage the cloud.


That’s not to say that a software-defined infrastructure isn’t appropriate. However, embracing a software-defined future becomes an evolutionary journey rather than a revolutionary event, one that requires careful planning to ensure a smooth migration.

Keeping the legacy systems operating requires access to skills that may be in increasingly short supply as the market for such capabilities slowly wanes. That presents organizations with a choice: nurture those skills in-house or pay a premium to hire them on the open market.

What’s more, while they are essential, these skills are usually not transferable to newer technologies, so there are no economies of scale to be had.


That might sound like an extreme case. However, even in businesses with fewer legacy concerns, the new software-defined infrastructure will typically grow up alongside the established systems, making the transition something of a balancing act.

This contrasts starkly with what we might call Web-scale IT organizations. These are typically younger businesses that started out with a “green field” and have built virtualization into their DNA. Some may be completely service-driven and own none of their IT equipment, such as data centers or server farms. Until recently, owning such infrastructure was the defining characteristic of medium- to large-scale organizations with regional or global ambitions.

While they may be light on equipment, they are typically heavy on expertise, with in-depth knowledge of software: its design, its deployment and the best ways to reap huge economies of scale.


That makes software-defined infrastructure a more straightforward proposition, calling for a different “size” of solution to take full advantage.

Most of today’s companies fall somewhere between these two examples. They require solutions with at least some degree of tailoring to meet their needs. Trying to implement a one-size-fits-all solution would probably be a recipe for disaster rather than success.

Another issue for established firms is that traditional “big iron” enterprise solutions such as Oracle and SAP remain critical business applications. The challenge today is to keep them relevant by making them more virtual, more mobile and software-enabled.


Using a software-defined approach can extend their lifespan significantly and give them new roles. This “application-led” approach allows businesses to add on new solutions as needed, abstract information easily, and automate and simplify the entire infrastructure.

The end result is an IT implementation that genuinely meets an organization’s needs, rather than a hurdle that forces the business to shape itself around the technology.

Infrastructure Is Not a Commodity


Another big myth is that, like consumer technology, enterprise technology has become a commodity. The idea is that the systems are more or less indistinguishable, so it doesn’t matter what you choose or what your service provider implements.

Again, nothing could be further from the truth when it comes to software-defined infrastructure. Building a software-defined architecture on commodity off-the-shelf (COTS) hardware might be great for some application workloads, but not for all. For example, middleware solutions, which merely act as aggregation or routing points, can make good use of COTS infrastructure in most cases. A similar argument applies to most open-source analytics solutions and content platforms.

However, the same commodity hardware will not benefit traditional online transaction processing (OLTP) applications or newer in-memory analytics platforms in terms of performance, real-time information access and insights, and ultimately business agility. Indeed, the economies of a COTS approach may be even less compelling when you consider that hardware acquisition now accounts for less than 20 per cent of the total cost of ownership of IT at most organizations. Put another way, even if commodity hardware halved acquisition costs, the overall saving would amount to less than 10 per cent of TCO.

And that’s a provable fact, not a myth!

(Vivekanand Venugopal is VP and General Manager, Hitachi Data Systems. The views expressed here are those of the author and CyberMedia does not necessarily endorse them.)
