CVu Journal Vol 17, #6 - Dec 2005

Title: Grid and Utility Computing - The Return of the Bureau

Author: Administrator

Date: Fri, 02 December 2005 06:00:00 +00:00


I recently read a piece about Sun Microsystems' 'Sun Grid' computing service. It was launched about a year ago with a great deal of razzmatazz. It offered computing power on tap at a cost of US$1.00 per hour per processor, and storage at US$1.00 per Gbyte per month. Grid and utility computing have been all the rage in the computer trade press and among the pundits for a number of years. Indeed, one commentator I read a couple of years ago claimed that the advent of grid computing would cause IT vice-presidents to suffer the same extinction as 'Electricity' VPs did in the twenties with the advent of the national electricity grid.

It was with a rather wry smile, therefore, that I read in the article that Sun was unable, even after running Sun Grid for a year, to name a single customer!

So What Are Grid and Utility Computing?

Sloppy usage - typical of marketing hype - has led to the two terms becoming interchangeable, but I would suggest that they are both about the efficient utilisation of computing resources. The two approach the problem from opposite ends, and most of the spin merchants are actually talking about utility computing, not grid computing. Grid computing is about tapping the unused processor power of existing computers, while utility computing is about having extra computing power on tap for peak usage - buying only the resources you use, instead of having extra hardware lying around that is needed only for brief peak periods.

Grid computing gained a big fillip with seti@home. This program brought together three things: the Internet, a supercomputing-type application, and desktop computers not currently being used. The Internet was used to network the computers into something approaching a supercomputer, to crunch the masses of data pulled in by a radio telescope system. Its success in failing to find an extraterrestrial civilisation inspired the bloggerati to proclaim that this was the one true way forward (again).

The problem with this model is twofold. First, there is the design and programming problem. You have to be able to break the program down into discrete packets - lots of them - which can all be run completely independently, as in the sketch below. Now it is, of course, usually possible, not to say desirable, to break a problem up into independent parts (at least from a programming point of view), but there is a limit to how far most problems can efficiently be broken down, and, let's face it, computing power is only one of a number of limitations that a real-life running program can face.
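To make that requirement concrete, here is a minimal sketch in C++ of an 'embarrassingly parallel' job. The names and numbers - work_unit, the term count, the number of packets - are invented purely for illustration, and local threads stand in for remote machines. The essential property is that each packet reads only its own slice of the problem and shares nothing with the others, so the packets could just as well be shipped to idle desktops across a network:

#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// One independent 'packet' of work: sum a slice of the Leibniz series
// for pi. It needs nothing but its own bounds - no shared state.
double work_unit(long begin, long end)
{
    double sum = 0.0;
    for (long k = begin; k < end; ++k)
        sum += (k % 2 == 0 ? 1.0 : -1.0) / (2.0 * k + 1.0);
    return sum;
}

int main()
{
    const long terms = 100000000L;   // size of the whole job
    const int  units = 8;            // how many independent packets
    const long chunk = terms / units;

    std::vector<double>      partial(units, 0.0);
    std::vector<std::thread> workers;

    // Farm the packets out. Order of completion does not matter.
    for (int i = 0; i < units; ++i)
        workers.emplace_back([&partial, i, chunk] {
            partial[i] = work_unit(i * chunk, (i + 1) * chunk);
        });
    for (auto& w : workers)
        w.join();

    // Combining the results is the only step that needs them all.
    double pi = 4.0 * std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "pi is approximately " << pi << '\n';
}

Notice that all the interesting coupling has been squeezed into the final accumulate; for problems where the parts genuinely need to talk to each other, no such tidy split exists.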

The other problem is that the number of processors available at any given time cannot be predicted in advance. This makes the concept useless for time-bounded programs. It works just fine for searching for aliens that may or may not exist, or for the speculative study of how proteins fold. For time-bounded requirements, though, the computing power is just too unpredictable, which is probably why there is no weather_today@home program.

Utility computing is a totally different kettle of fish. The computing power is 'delivered' to your building by cable or fibre: you just plug in your terminal, log on to a remote server farm, and run whatever applications you want.

It has a certain superficial attraction, especially to large companies with wildly fluctuating computing needs. If you buy computing power only as you need it, you don't have to make sure you own enough machines for peak consumption. There is also the advantage that you never pay for more than you use - and most business PCs are in use only during working hours, i.e. for only a third to a half of the day. A back-of-the-envelope comparison is sketched below.
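That back-of-the-envelope comparison, as a C++ sketch. The US$1.00 per processor-hour rate is Sun Grid's published price quoted earlier; every other figure - the peak fleet size, the average load, the cost and lifetime of an owned machine - is invented purely to illustrate the shape of the argument, not to settle it:

#include <iostream>

int main()
{
    // Sun Grid's published price, from the article above.
    const double rate_per_cpu_hour = 1.00;          // US$

    // Hypothetical workload: the fleet must cover a peak of 50 machines,
    // but average use is only 10 machines for 8 hours a day.
    const int    peak_machines     = 50;
    const double avg_machine_hours = 10 * 8.0;      // per day

    // Hypothetical ownership cost: US$3000 a box, amortised over 3 years.
    const double owned_monthly_per_machine = 3000.0 / 36.0;

    const double metered_monthly = rate_per_cpu_hour * avg_machine_hours * 30;
    const double owned_monthly   = owned_monthly_per_machine * peak_machines;

    // With a peaky load the metered bill tracks average use, while the
    // owned fleet must be sized - and paid for - at the peak.
    std::cout << "metered: US$" << metered_monthly << " per month\n"
              << "owned:   US$" << owned_monthly   << " per month\n";
}

With those made-up numbers the metered bill comes to US$2400 a month against roughly US$4167 for an owned fleet sized for the peak; flatten the load, and the advantage evaporates.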

Utility computing is the natural successor to the large computing bureaux of the 70s and 80s. It is being pushed by the big computing companies, especially IBM, HP and Sun. It is in some ways difficult to see why they are so enthusiastic, because the implications for them, should they succeed, are not good. It may be that they simply haven't thought it through properly, which seems odd but is, I suppose, possible. There are also major obstacles in the way, and serious disadvantages from the consumer's point of view.

Leaving aside, for the moment, everything else, let's assume that one of these companies succeeds in establishing utility computing as the way everyone gets their computing power. What then? Well, the first thing to note is that utilities of this nature are always natural monopolies, at the very least at a local level. In the industrialised world, at least, this has one of two consequences: either the utility is publicly owned, or, if it isn't, it is heavily regulated. The latter is the more likely case in the US and the UK.

Do these big companies really want their activities regulated by local and national oversight boards? I cannot imagine why they would. And, interestingly enough, I can't think of any private utility company that hasn't tried to diversify out of its utility sector during the last 20 years. Indeed, some of the most spectacular and massive corporate failures of recent years have been utility companies diversifying in search of larger profits than those allowed in their original business - Enron being only the most glaring example.

There is also the strategic question of whether putting all the data in a few massive data centres makes it more vulnerable to terrorist strikes, in the period since 9/11 and the London bombings.[1]

But, over and above the dire consequences of success for the operating companies, there are serious flaws in the logic of utility computing. The most obvious question to ask is whether computing power is really the same sort of beast as electricity or water. I would suggest not, for two main reasons.

First, it seems to me that the crux of the matter is that it is easy for urban dwellers to obtain computing power of their own relatively cheaply, while most cannot do the same for electricity or water.

A decent computer costs less than a washing machine. Few people - even companies - possess their own rivers, dams, coal mines, oil wells, or even the space to install a reasonably sized generator. Interestingly, for instance, the London Underground was powered from its own power station at Lots Road until 2002. It was economic for it to generate its own electricity, so it did so until the land the station sat on (in Chelsea) became too valuable for industrial use.

As an aside, I would argue that if a new generation of compact, cheap electricity generators came to market, we would see a steady move by consumers away from the electricity utilities. People, and companies, prefer to own their own resources rather than continually buy them in from a utility. It's not just a financial thing - it's a matter of convenience too. How many people do you know who would rather use a launderette than their own washing machine, even though for most of the week the machine sits unused?

The other problem is that computing power and storage are not a utility in the classic sense. The supplier doesn't push computing power or storage down a fibre-optic pipe for you to use, as with water or electricity. Quite the contrary: everything stays at their end, in the server and storage farms. This isn't a utility model - it's a computing bureau model, which everyone abandoned as soon as computing power became cheap enough to let them.

Why, then, are companies pushing utility computing, and why are big corporations starting to look interested?

Well, on the one hand, the big providers are seeking a way to re-establish the control over computing which they lost with the coming of age of the personal computer. On the other hand, the primitive nature and lack of commoditisation of software (note - software, not hardware) makes anything that means you don't have to deal with Information Technology (IT) yourself look attractive.

It is the latter that is currently driving the move to outsourcing by large corporations, and grid computing is in many ways a continuation of this trend. Notable, however, is the struggle by a number of firms that outsourced their IT in the eighties and early nineties to bring their computing back in-house so they can regain control of their strategic IT. The more you survey the domain, the more obvious it becomes that there is a large dose of wishful thinking going on here. Clearly the protagonists - both grid buyers and sellers - really do believe that the grass is greener on the other side...

The computing bureau is dead. Long live the bureau!



[1] My more astute readers will recognise this as a stock sales-weasel 'the terrorists are coming' soundbite, designed to part gullible government ministers from large quantities of public cash in return for several vats of digital snake oil...
