A minor storm erupted recently after Andrew McAfee published a post discussing the modern-day application of the 1865 writings of William Stanley Jevons. In The Coal Question, a book about the consequences of greater energy efficiency, Jevons posited that greater energy efficiency leads not to lower total energy consumption, but to exactly the opposite outcome: higher aggregate consumption. McAfee then applies this theory to computing, saying that if Jevons was right, and Moore’s Law continues to hold true, then computers are going to keep getting cheaper, and aggregate demand for them is going to continue to rise.

So far, so good. But then, Simon Wardley emitted a seemingly innocuous tweet, saying:

Anyone in the #cloud space should read [the article] before making the claim that “cloud will reduce IT expenditure”

Wardley was extending McAfee’s argument and claiming that a move to Cloud Computing will in fact lead not to a reduction in spending, but rather to an increase in consumption as unit price decreases. I’ve got two thoughts about this subject, one in relation to McAfee’s hypothesis, and one in relation to Wardley’s extension of the hypothesis.

Not all laws continue ad infinitum, not even Moore’s

McAfee seems to ignore some realities of both economics and physics in his assertion about the continued reduction in the cost of computers. There comes a point (and some argue that point is coming sooner rather than later) when engineering realities mean that chips cannot keep getting smaller and more powerful, when Moore’s Law finally plateaus and we reach some sort of equilibrium in terms of computing power. At the same time, the industry is beginning to run up against some economic realities – a point where the cost of producing chips, and the devices they power, can no longer fall. McAfee acknowledges these realities himself when he says that:

There’s some level of demand among companies for computational power, and until that demand is met investment will continue. Total investment could well increase for a while, even as prices dropped, because there’s so much thirst for computers, but after some finite period of time it’ll level off and maybe even start to decrease. After everybody’s got an Internet-connected PC and a smartphone, and after datacenters have centralized and moved into the cloud, total US corporate spending on computing gear will taper off, right?

So, by extension, at some point in the future we reach a time when unit price ceases to fall. More important, at least in relation to the Cloud part of the discussion, are the cost/value aspects.

Cloud IS about reducing TCO

Per Joe Weinman and his 10 Laws of Cloudonomics, yes, Cloud Computing is cheaper than traditional IT – even when, per unit, it’s not. To quote Weinman:

Cloudonomics Law #1: Utility services cost less even though they cost more. Although utilities cost more when they are used, they cost nothing when they are not. Consequently, customers save money by replacing fixed infrastructure with Clouds when workloads are spiky, specifically when the peak-to-average ratio is greater than the utility premium.
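To make Weinman’s arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python of the break-even condition he describes; the demand figures, unit price and premium are entirely hypothetical. Owned infrastructure is paid for at peak capacity around the clock, while utility capacity is paid for only when used, at a premium.

    # Hypothetical illustration of Cloudonomics Law #1 (all numbers are made up).
    # Owned infrastructure must be sized for peak demand and paid for 24x7;
    # utility (cloud) capacity is billed only for actual usage, at a premium.

    def owned_cost(peak_demand, unit_price, hours):
        """Fixed infrastructure: pay for peak capacity for every hour."""
        return peak_demand * unit_price * hours

    def cloud_cost(average_demand, unit_price, utility_premium, hours):
        """Utility model: pay only for average usage, but at a premium per unit."""
        return average_demand * unit_price * utility_premium * hours

    peak, average = 100, 20        # a spiky workload: peak-to-average ratio of 5
    unit_price, hours = 1.0, 720   # arbitrary unit price, roughly one month
    premium = 3.0                  # the cloud unit costs 3x the owned unit

    print(owned_cost(peak, unit_price, hours))              # 72000.0
    print(cloud_cost(average, unit_price, premium, hours))  # 43200.0

    # Cloud wins here because the peak-to-average ratio (5) exceeds the utility
    # premium (3) – exactly the condition Weinman states. Flatten the workload
    # (say peak = average = 100) and owned infrastructure wins instead.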

In assessing the “Cost” and “Value” of cloud computing, it is important to realize that Cloud Computing, with its ease of use, scalability and rapid provisioning, allows projects to get the go-ahead that, under traditional IT models, simply wouldn’t have met the cost/benefit test. As such, the contention is that Cloud Computing will increase the net number of IT projects in businesses (a fact that my enterprise friends confirm). This being the case, IT budgets will not necessarily decrease, and may well increase, but the total units consumed (or projects undertaken, or whatever similar metric you wish to apply) will increase.
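To put that dynamic in rough numbers, here is an illustrative sketch; the project names, benefits and unit costs are invented. It shows how a lower per-unit infrastructure cost can let more projects clear the cost/benefit hurdle, so that total spend rises even as each unit gets cheaper and the overall return improves.

    # Illustrative only: invented projects, benefits and unit costs.
    # A project gets the go-ahead only if its expected benefit exceeds its
    # compute cost; watch what happens when the per-unit cost drops.

    projects = [  # (name, expected_benefit, compute_units_required)
        ("reporting", 50_000, 200),
        ("analytics", 30_000, 400),
        ("archive search", 12_000, 300),
        ("ad-hoc modelling", 8_000, 250),
    ]

    def approved(projects, cost_per_unit):
        return [(name, benefit, units) for name, benefit, units in projects
                if benefit > units * cost_per_unit]

    for cost_per_unit in (100, 25):  # traditional IT unit vs. a cheaper cloud unit
        chosen = approved(projects, cost_per_unit)
        spend = sum(units * cost_per_unit for _, _, units in chosen)
        benefit = sum(b for _, b, _ in chosen)
        print(cost_per_unit, len(chosen), spend, round(benefit / spend, 2))

    # At 100 per unit only one project passes (spend 20,000; return 2.5x).
    # At 25 per unit all four pass (spend 28,750; return ~3.48x): more projects,
    # higher total spend, and a better return on the money spent.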

So, to answer McAfee and the others: yes, total IT expenditure will potentially increase or stay the same, BUT (and this is the important part) per-unit cost will decrease, the number of projects meeting the cost/benefit requirements will increase and, most importantly, ROI will increase.

And that, dear readers, is way more important than a simple expense-side number. (Oh and, by the way, there’s more on this discussion over on Focus.)


1 Comment
  • Stuart Fawcett

    Individual personal computing will get more expensive as soon as thin clients break into the computing mainstream, in the same way that laptops are outselling desktops in the mainstream consumer sector. Your games machine/media server (which will replace laptops) will provide local graphics processing, and all information stores will be cloud-backed. IT personnel costs should reduce, as only printers, terminals and some local continuity will reside within a business.
