Over the past decade or so, a trend has grown to become pretty much the default view of infrastructure: “software-izing” functionality that was formerly the domain of specialist hardware.

It all started (arguably) with the idea of virtualization. Instead of needing a physical server for every task, software would allow numerous virtual servers to run on a single piece of physical kit. The upshot of server virtualization was far greater efficiency, with utilization rates going from dismal to near-total. All good outcomes if you’re worried about the economics of technology.
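To put a rough number on that efficiency claim, here’s a minimal back-of-the-envelope sketch in Python. The workload figures and the 80% safety ceiling are invented for illustration, not drawn from any real deployment; the point is simply that packing many lightly loaded one-task servers onto shared virtualized hosts is what moves the utilization needle.

```python
# A back-of-the-envelope consolidation sketch. The workload numbers
# below are invented for illustration, not real measurements.

workloads = [0.08, 0.05, 0.12, 0.07, 0.10, 0.06, 0.09, 0.11]  # avg CPU demand, as a fraction of one server

# Before virtualization: one physical server per workload.
servers_before = len(workloads)
utilization_before = sum(workloads) / servers_before

# After virtualization: pack the same workloads onto as few hosts as
# possible, leaving headroom under a hypothetical 80% safety ceiling.
CEILING = 0.80
hosts = []  # current load on each virtualized host
for demand in sorted(workloads, reverse=True):  # first-fit decreasing packing
    for i, load in enumerate(hosts):
        if load + demand <= CEILING:
            hosts[i] += demand
            break
    else:
        hosts.append(demand)  # no existing host has room; add a new one

servers_after = len(hosts)
utilization_after = sum(workloads) / servers_after

print(f"before: {servers_before} servers, {utilization_before:.0%} average utilization")
print(f"after:  {servers_after} host(s), {utilization_after:.0%} average utilization")
```

With these made-up numbers, eight servers idling at under 10% utilization each collapse onto a single host running at around 68%, which is the kind of consolidation that made the economics so compelling.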

But it wasn’t just compute that got this dose of software goodness. Next came storage, then networking. And seemingly the sky is the limit as to what parts of infrastructure can be made virtual. (And in the next realm of innovation, we have serverless computing where, in effect, stuff happens without even having to think about servers—physical or virtual. But that’s another story.)

Ben Kepes

Ben Kepes is a technology evangelist, an investor, a commentator and a business adviser. Ben covers the convergence of technology, mobile, ubiquity and agility, all enabled by the Cloud. His areas of interest extend to enterprise software, software integration, financial/accounting software, platforms and infrastructure, as well as articulating technology simply for everyday users.
