Utility Computing – Overview & Analysis

What is Utility Computing?

Nick Carr published a book about it this year, called The Big Switch. Amazon, through its Web Services division, is the (surprising) world leader in this area, with a huge first-mover advantage, a massive developer community some 400,000 strong, and an estimated $500 million in revenue to be generated from it in 2008. Effectively, Utility Computing means accessing computing resources such as server infrastructure (Windows / Linux), storage, and other services that can be built on top of these (Business Continuity or Data Backup, for example), on a ‘pay as you go’ model. The pay-as-you-go model is really where the ‘Utility’ part comes from, and it is the critical, game-changing part of this new area.

Nick Carr compares Utility Computing to the consumption of and payment for electricity: you don’t have a generator at home constantly powered on, generating ‘x’ kilowatts per hour; you use as much or as little electricity as you require, and only pay for what you have actually consumed. Computing is moving towards this model, at least for raw computing resources, and it is going to massively disrupt the industry.

Why Utility Computing?

Why would you purchase an expensive server infrastructure installation that runs at an average of 15% capacity, while paying for both the capital purchase and the running costs? If you could pay only for the resources you need, and scale your infrastructure pool up and down as your organisation’s consumption of IT changes, you would make huge financial savings: in capital (purchasing the infrastructure) and in IT operations (running, maintaining, and growing the infrastructure). Utility Computing thus provides a model for consuming IT in a manner that suits efficient business operation, just like the consumption of any utility such as water or electricity: take advantage of a provider’s economies of scale and pay as you go.
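The financial argument above can be sketched as a back-of-the-envelope comparison. All the figures below (server prices, hourly rates, utilisation) are hypothetical, chosen only to show the shape of the calculation, not real vendor pricing:

```python
# Illustrative comparison of owned vs. utility-billed server capacity.
# All numbers are hypothetical examples, not real prices.

def owned_cost(servers, capex_per_server, opex_per_server_year, years):
    """Total cost of owning a fixed pool, paid regardless of utilisation."""
    return servers * (capex_per_server + opex_per_server_year * years)

def utility_cost(avg_servers_used, price_per_server_hour, years):
    """Pay-as-you-go: only the capacity actually consumed is billed."""
    hours = years * 365 * 24
    return avg_servers_used * price_per_server_hour * hours

# A pool of 20 servers running at ~15% average utilisation...
owned = owned_cost(servers=20, capex_per_server=5000,
                   opex_per_server_year=2000, years=3)
# ...does roughly the work of 3 fully used servers billed by the hour.
utility = utility_cost(avg_servers_used=3, price_per_server_hour=0.40, years=3)

print(f"Owned:   ${owned:,.0f}")
print(f"Utility: ${utility:,.0f}")
```

With these example numbers the owned pool costs roughly seven times the utility bill over three years; the gap is driven almost entirely by paying for the 85% of capacity that sits idle.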

Challenges and Opportunities Utility Computing Presents

1. Systems Integrators and Value Added Resellers

These companies make their bread and butter by selling servers and related appliances, with maintenance contracts, into organisations. Utility Computing presents a big challenge for them: their customers will no longer want to spend money on hardware; they will want Utility billing. If they don’t change their game, they risk being sidelined as IT departments buy their compute resources directly from big Utilities, and pay as they consume.

I think there exists an opportunity here for innovative SIs and VARs to offer On-Premise Utility Computing. Existing players are all Web-based, remote, and not very customisable. If a VAR were to offer the same level of service businesses currently enjoy, by installing the infrastructure on-premise but charging only for the resources being utilised, I think it would have a compelling competitive advantage over the ‘Big Players’ like Amazon, and would get a leap ahead of the old-school companies that will eventually see their server-selling business disappear. I would suggest that if a supplier here were to run the financial model, they may find that Utility billing does not reduce client revenue (over the lifetime of the contract), and that it creates a much more symbiotic client relationship through which additional services can be sold.

Further opportunities then exist from the On-Premise, Utility-billed installation, for the supplier to provide ‘Bursting’ to Off-Premise Utilities, and to provide additional services such as Data Backup, Remote Access, Business Continuity and Disaster Recovery, and Unified Communications. I think we may see some new entrants to this market if the SIs and VARs don’t move fast enough to provide this service to their clients.

2. Hosting Providers and Data Centre Providers

Traditional Hosting Companies and Data Centre Colocation Providers will see their business affected by Utility Computing in 2009. Hosting Providers already save companies money on the capital expenditure of purchasing infrastructure, by leasing it to them as part of a services contract. Data Centre providers supply the raw space to connect a customer’s infrastructure to the Internet. Both of these areas will see customers moving to Utility Computing as a replacement for their services. DC providers will still get business from SIs and VARs moving customer infrastructure to an Internet-delivery model (although this won’t last more than another 3 years), and from bespoke infrastructure that standard Utility Computing simply cannot provide for.

Hosting Companies are far more threatened: Utility Computing as provided today by the likes of Amazon is a direct competitor, providing nigh-on the same service. The opportunity, I believe, is to provide a true Managed Service and a ‘Managed Services’ SLA on a ‘Hybrid Utility Base’. I think monthly payments on a contracted term are still feasible, although they will become much more difficult to win against Utility Computing offerings. But a model with a consultative sell and setup, a high-end SLA and service, and a monthly minimum for infrastructure that can ‘burst’ as needed, with the bursts paid for on a Utility model, will be an opportunity for Hosting Companies to maintain their customer base, continue to grow, and compete effectively with pure Utility Providers.
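The hybrid model described above, a contracted monthly minimum plus metered bursts, can be sketched as a simple billing function. The rates and thresholds below are hypothetical, purely to illustrate how such an invoice would be structured:

```python
# Sketch of a 'monthly minimum plus metered burst' billing model.
# Rates and thresholds are hypothetical examples.

def hybrid_invoice(monthly_minimum, included_server_hours,
                   used_server_hours, burst_rate_per_hour):
    """A contracted base fee covers a block of capacity; any usage
    beyond it is billed per server-hour on a utility model."""
    burst_hours = max(0, used_server_hours - included_server_hours)
    return monthly_minimum + burst_hours * burst_rate_per_hour

# Quiet month: usage stays inside the contracted base.
print(hybrid_invoice(2000, 5000, 4200, 0.50))  # 2000
# Busy month: 2,000 extra server-hours burst to the utility pool.
print(hybrid_invoice(2000, 5000, 7000, 0.50))  # 3000.0
```

The appeal for the hosting company is a predictable revenue floor; the appeal for the customer is that peak demand no longer dictates the size (and price) of the contracted base.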

3. IT Departments

In-house IT organisations that currently spend a lot of time provisioning and maintaining server infrastructure will see Utility Computing outsource a lot of their day-to-day tasks. If servers are provided by a 3rd party, and their configuration, the provisioning of additional servers, and infrastructure environment maintenance are all taken care of by that 3rd party, IT Departments will have to justify their ongoing existence, certainly in their current form.

The challenge this presents is that company CFOs will see this as a dual opportunity: to save money on IT capital and maintenance expenditure, and to reduce staff overhead in the IT department. The opportunity for IT Depts to mitigate this is to brainstorm and create a plan for adding strategic value to the business. With resources freed up in terms of budget and time, the IT Dept should propose how it can more tightly integrate with business management and operations, and once again try to be the catalyst provider of competitive advantage (as IT used to be when it was first being introduced: a better IT setup meant a better business). In this way internal IT can be the instigator of the change to utility, both saving the company money and providing a higher-value service: a ‘win-win’.


A couple of interesting potential developments may occur. Hosting Companies, Systems Integrators, and VARs may converge into a single type of company with very little to differentiate them. Or Hosting Companies will provide bare-bones infrastructure to existing SIs and VARs as their Channel Partners: the SIs and VARs will no longer try to sell any hardware, but will instead take a margin on the Hosting Company service they implement, and move up the value chain to ‘Strategic IT Partner’ rather than pure ‘Supplier’.

Utility Computing will certainly be a big player in the IT business in 2009 and onwards. 2008 was the ‘early adopter’ and ‘buzz’ year; 2009 will see it enter the real business world.

