Computing in the Cloud: A Practical Review

If you were to read everything you can, listen to every available expert, and ask the really tough questions about cloud computing, you might draw the same conclusion that we have drawn. Cloud computing is like high school kids driving muscle cars: everyone is talking about it, very few are actually doing it, and nobody is good at it. The main issue we see is that the various industry groups are each going in their own direction, doing their own thing, with only the loosest of coordination between them. And when there is coordination, it is more the result of shared members than of any formal effort. At least in the last battle royale – the packet vs. circuit debate – there were two clear thought leaders: the Internet Engineering Task Force (IETF) and the International Telecommunication Union (ITU). The US National Institute of Standards and Technology (NIST) may beg to differ on this point, but we’ll discuss that more later.

Another important issue is the sheer size and complexity of cloud computing relative to its apparent simplicity. Stated more directly, cloud computing appears simple but is very complex. That apparent simplicity has caused many organizations to move to cloud computing somewhat blindly, driven largely by promises of simplification that will yield cost savings, or just by the promise of cost savings alone. Cost savings alone is often cause enough for celebration. And, interestingly, the results achieved by the organizations that “just go for it” are, unexpectedly, usually very positive. They may hit a rough spot or two, but rough spots can be expected during any migration to, or adoption of, a new technology. How is this possible? Any dim-witted manager who makes a mad dash toward cloud computing with a gold-rush mentality, dragging the entire organization along, should be banished, or worse, but that isn’t usually the outcome. Those dim wits have been saved in just the same way that many who rushed blindly toward IP networks lived to tell the tale, and for the same reason: they have a “go-to person.” That go-to person may be staff, consultant, vendor, or service provider, but that go-to person did all of the reading, learning, thinking, testing, and planning, much of it on their own time. Or the go-to person is bruised, banged, and battered because they were the cushion when the cloud computing project was on the rocks. Either way, that go-to person kept a Wizard of Oz-like control – pay no mind to that man or woman behind the curtain – and everything turned out just fine. Like most efforts as important as an organizational evolution in the way information is collected, accessed, and managed, there is no middle ground: success is won either through great planning and execution or by sleight of hand and tricks with mirrors. Either way it’s considered a success.

Where did this “cloud” thing come from, anyway?

One reason for the complexity/simplicity duality is the cloud model itself. The original intention of the cloud model, from its first use, was to hide the complex reality of what goes on inside a network. The cloud model came about more or less by accident in 1974 or 1975. IBM Account Managers back then were really highly technical engineers who had memorized some people skills. One particular IBM Account Manager had just returned from weeks of training on the new Systems Network Architecture (SNA). He wanted to share his new knowledge with his customer and had enthusiastically filled a chalkboard in the customer’s office with diagrams of IBM System/360 and System/370 mainframe computers in different locations connected via 9.6 Kbps and 19.2 Kbps leased lines and Front End Processors to other S/3xx computers and 327x terminal clusters. He had thrown in TSO, VTAM, CICS, LUs, PUs, SDLC, and even X.25 for good measure. Then he saw the look of total confusion on his customer’s face. That IBM Account Manager realized he had committed the ultimate IBM Account Manager mistake: he had created a barrier to the sale. His customer might feel he had to understand everything on the board before signing the contract, and that could take months! Just then his people skills kicked in. The IBM Account Manager took an eraser and erased all of the complex networking stuff sitting between the customer’s computers. The white chalk residue against the black slate of the chalkboard looked remarkably like a cloud, all pretty and fluffy and white, only hinting at the darkness and complexity inside. The cloud model was thus born. It was then, as it is now, a way of saying “let the experts worry about it.”

When the World Was Young

One of the authors well remembers a class in high school when the teacher talked about “when the world was young”. The teacher talked about how interesting it would have been to be a part of the western push of the early European settlers as they went through what is now North Carolina, claiming their homesteads and naming rivers, mountains and valleys in the virgin territory. Later, in Study Hall, a classmate who was part Cherokee talked about how “full of it” the teacher was. The Cherokee had inhabited the same ground for thousands of years before the arrival of the Europeans and had already named everything quite well, “thank you very much!” He went on to point out that living there for generations before the Europeans ever arrived gave the Cherokee (properly Tsa-la-gi) people a knowledge and closeness to the land that would take thousands of years to cultivate from zero, if the Europeans even cared to. There is a similar issue in cloud computing. Those of us who have been in computing and networking from the beginning recognize the old things with new names. Are not many of the “innovations” of cloud computing similar to time sharing or integration or even outsourcing or the web? They are, and the parallel capabilities are accompanied by many of the same, unsolved, issues such as management, governance, security, billing and the rest of the list. In cloud computing, as in the European history in North Carolina, don’t be confused by the apparent newness of something or lack of experience. And, don’t hesitate to call on the natives for help when needed.

Ad hoc, ad loc and quid pro quo …

The Beatles’ Nowhere Man said, “Ad hoc, ad loc and quid pro quo. So little time, so much to know!” Consider that at the time the Nowhere Man was created in 1965, there was barely a megabyte of memory on the planet, and experts were postulating that the coming home computer revolution would feature home computers occupying an entire room. Cloud computing has brought with it an immense growth in the amount there is to know, making it thousands of times more complex than anything the Nowhere Man had to deal with. How does one come to understand enough of this immense, explosively growing body of knowledge to deal with it successfully? There is a Rosetta Stone of sorts, provided by the CloudStandards Wiki. Among other things, the CloudStandards Wiki provides a centralized point for accessing information related to 13 different de jure or ad hoc standards bodies or standards influencers and their work. None of those efforts, however, cover the lower-layer cloud computing standards being shepherded through the standardization process by organizations such as the IEEE, the ITU, or even ANSI, which are not listed on the CloudStandards Wiki but must be considered to get the full picture.

In addition to all of the reading, formal education must be a part of the learning process to help you bring logic and structure to what you have learned and to put it to best use … the first time.


Are we optimistic about cloud computing? Yes, we are. But we are also cautious and a bit skeptical, like many networking and IT professionals. We have seen a lot of this before. We have been the planners with good execution, and we have also been the cushion that kept the ship from crashing on the rocks. In many ways cloud computing is a bright new world, but in many ways it is business as usual. The key is to figure out which parts are which, and to apply hard-won knowledge, combined with new insights and skills, to push networking to heights of usability, levels of ease of use, and lows of cost that humankind has never before experienced. Cloud computing really does have the potential to deliver on all of these promises.


Editorial Note: Jim Cavanagh, who heads up the Eogogics Inc networking team, is the developer of the Eogogics Cloud Computing curriculum. Having lived through the Service Bureau era in the ’70s and outsourcing’s heyday in the ’80s and ’90s, he is duly skeptical of Cloud Computing, reluctant to just mix up batches of “Cloud Computing Kool-Aid” and pass it around. He approaches the topic not only from a holistic business-and-technical perspective but also as a realist, with plenty of real-life knowledge about what makes a great Cloud Computing application as well as the Cloud Computing pitfalls organizations should avoid. Jim has 35+ years of experience in IT and telecom encompassing network strategy, planning, design, implementation, applications, security, troubleshooting, and marketing. He’s also a dynamic presenter and the author of seven books on IT and telecom, including Cloud Computing: Success Stories and Cautionary Tales, due out in 2012. (More about Jim) … The article’s co-author, KK Arora, is the Founder and President of Eogogics Inc. and the inspiration for the Eogogics Cloud Computing curriculum. His career also spans 35+ years in IT, telecom, and IT/telecom education. (More about KK)