Where the cloud lives – A look at the evolving distributed computing environment

by Bill Kleyman
12 March 2013

The idea behind cloud computing has been around for some time. In fact, one of the very first scholarly uses of the term “cloud computing” was in 1997, at the annual INFORMS meeting in Dallas. However, the way the cloud has evolved into what we use today shows just how much creativity and technological innovation is possible. Distributed computing platforms are helping IT professionals conquer distance and deliver massive amounts of information to numerous points all over the world.

This is all made possible by a number of technologies working together to bring cloud computing to life. Oftentimes, however, there is still some confusion around cloud computing – not so much about how it works, but about where it lives. Many will argue that virtualization gave birth to the cloud. Although server, application, network and other types of virtualization platforms certainly helped shape and mold cloud computing, there are other – very important – pieces to the cloud puzzle.

One IT landscape with many clouds in the sky

The very first concept that needs to be understood is that there isn’t one massive cloud out there controlling or helping to facilitate the delivery of your data. Rather, there are numerous interconnected cloud networks out there which may share infrastructure as they pass each other in cyberspace. Still, at the core of any cloud, there are key components which make the technology function well.

  • The data center. If the cloud has a home, it would have to be a data center – or, more specifically, a neighborhood of data centers. Data centers house the integral parts that make the cloud work. Without servers, storage, networking, and a solid underlying infrastructure – the cloud would not exist today. Furthermore, new advancements in high-density computing are only furthering the power of the cloud. For example, Tilera recently released its 72-core GX-72 processor, a 64-bit system-on-chip (SoC) equipped with 72 processing cores, four DDR memory controllers and a heavy emphasis on I/O. With parts like these, cloud architects are able to design and build a truly “hyper-connected” environment with an underlying focus on performance.

    Beyond the computing power, the data center itself acts as a beacon for the cloud. It provides the resources for the massive number of concurrent connections the cloud requires, and it will do so ever more efficiently over the years. Even now, cloud data centers are striving to become more efficient. Power Usage Effectiveness (PUE) has been a great metric for many cloud-ready data centers to manage the energy overhead associated with running a facility, and more and more data centers are trying to approach a 1.0 rating as they deliver more data more efficiently (a quick sketch of the calculation follows this list). With the increase in data utilization and cloud services, there is no doubt that the data center environment will continue to play an integral part in the evolution of cloud computing.

  • Globalization. The cloud is spreading – and it’s spreading very fast. Even now, data centers all over the world are creating services and options for cloud hosting and development. No longer an isolated front – cloud computing is truly being leveraged on a global scale. Of course, technologies like file-sharing were already a global solution. However, more organizations are becoming capable of deploying their own cloud environment. So, when we say that the cloud lives in a location – we mean exactly that.

    Historically, some parts of the world simply could not host or create a robust cloud environment. Why? Their geographic region could not supply the resources that a given cloud deployment requires. Fortunately, this is all changing. At the 2012 Uptime Symposium in Santa Clara, CA, we saw an influx of international data center providers all in one room competing for large amounts of new business. The best part was that all of these new (or newly renovated) data centers had truly advanced technologies capable of allowing massive cloud instances to traverse the World Wide Web. This is a clear indication that, geographically, the cloud is expanding and that there is a business need for it to continue to do so.

  • Consumerization. One of the key reasons the cloud is where it is today is the cloud consumer. IT consumerization, BYOD, and the sheer influx of user-owned connected devices have generated an enormous amount of data. All of this information now needs to span the Internet and utilize various cloud services. Whether it’s a file-sharing application or a refrigerator that can alert its owner that it’s low on milk – all of these solutions require the cloud. Every day, we see more devices connect to the cloud. To clarify, when we say devices, we’re not just referring to phones, laptops, or tablets. Now, we have cars, appliances, and even entire houses connecting to cloud services. The evolution of the cloud revolves around the demands created by the end-user. This, in turn, forces the technology community to become even more innovative and progressive when it comes to cloud computing.

    As more global users connect to the Internet with their devices – the drive to grow the cloud will continue. This is why, in a sense, the cloud will eventually live with the end-user. Even now new technologies are being created to allow the end-user to utilize their own "personal cloud." This means every user will have their own cloud personality with files, data, and resources all completely unique to them.

  • Cloud Connectors. As mentioned earlier, there isn’t really just one large cloud out there for all of the users trying to access it. The many private, public, hybrid, and community clouds out there comprise one massive interconnected cloud grid. In that sense, the evolution of the cloud created an interesting, and very familiar, challenge: a language barrier. For example, as one part of an enterprise grows its cloud presence, another department might begin a parallel cloud project on a different platform. Now there is a need to connect the two clouds together. But what if these two environments are built on completely different cloud frameworks? It’s in this sense that we deploy the cloud “Babel Fish” in the form of APIs. These APIs effectively act as cloud connectors that help organizations extend, merge, or diversify their cloud solutions.

    It’s not a perfect technology, and even now not all cloud platforms can fully integrate with others. However, the APIs are getting better and more capable of supporting large cloud platforms. New technologies like CloudStack and OpenStack help pave the way for the future of cloud connectivity and APIs. Platforms like these are open-source cloud software solutions that help organizations create, manage, and deploy infrastructure cloud services (a minimal sketch of such a connector follows this list).
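
To make the PUE metric mentioned above concrete, here is a minimal sketch of the calculation in Python. The energy figures are invented purely for illustration.

```python
# Minimal sketch: Power Usage Effectiveness (PUE) is the ratio of total
# facility energy to the energy consumed by IT equipment alone; 1.0 is ideal.
# The sample figures below are purely illustrative.

def pue(total_facility_kwh, it_equipment_kwh):
    """Return the PUE ratio for a given measurement window."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

if __name__ == "__main__":
    # e.g. 1,500,000 kWh drawn by the whole facility, 1,000,000 kWh by IT gear
    print("PUE: {:.2f}".format(pue(1500000, 1000000)))  # -> PUE: 1.50
```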
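
On the connector side, one way to picture the “Babel Fish” idea is a library such as Apache Libcloud, which hides many different provider APIs behind a single interface. The sketch below is illustrative only: the credentials and regions are placeholders, and a real deployment would use whichever providers and drivers actually apply to it.

```python
# Illustrative sketch of a multi-cloud "connector" using Apache Libcloud
# (pip install apache-libcloud). Credentials and regions are placeholders.
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

# Two clouds built on completely different frameworks...
amazon = get_driver(Provider.EC2)("ACCESS_KEY_ID", "SECRET_KEY", region="us-east-1")
rackspace = get_driver(Provider.RACKSPACE)("USERNAME", "API_KEY", region="dfw")

# ...inventoried through the same set of calls.
for cloud in (amazon, rackspace):
    for node in cloud.list_nodes():
        print(cloud.name, node.name, node.state)
```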

In a cloudy world – bring an umbrella!

Let’s face facts – it’s not always sunny in the cloud. As the technology continues to mature, IT professionals are still learning the best practices and optimal ways to keep the cloud operational. This isn’t proving altogether easy. There are more users, more infrastructure, and more bad guys taking aim at various cloud infrastructures. Just as it is important to understand how the cloud functions and where it resides, it is vital to know the "weather forecast" within the cloud computing environment.

  • Attacks. Although this is the darker side of the cloud, it still needs to be analyzed. As more organizations move towards a cloud platform, it’s only logical to assume that these cloud environments will become targets. Even now, attacks against cloud providers are growing. This can be a direct intrusion or a general infrastructure attack. Regardless of the type, a malicious intrusion can have devastating results on a cloud environment. Over the past few months, one of the biggest threats against cloud providers has been the influx of DDoS attacks. A recent annual Arbor Networks survey showed that 77% of the data center administrator respondents experienced more advanced, application-layer attacks, and that such attacks represented 27% of all attack vectors. The unnerving part is that the ferocity of these attacks continues to grow, with attack traffic spiking at 100 Gbps as far back as 2010.

    Cloud services aren’t always safe either. On February 28, 2013, Evernote saw its first signs of a hack. Passwords, email addresses and usernames were all accessed, and the provider is now requiring its nearly 50 million users to reset their passwords. The damage with these types of attacks isn’t always just the data: Evernote also had to release a public response – and these are always difficult to do. In the responding blog post, Phil Libin, Evernote’s CEO and founder, said the following: "Individual(s) responsible were able to gain access to Evernote user information, which includes usernames, email addresses associated with Evernote accounts and encrypted passwords." These types of intrusions can only serve as learning points to create better and more secure cloud environments.

  • Outages. If you place your cloud infrastructure in a single data center, you should know that your cloud environment can and will go down. No major cloud provider is safe from some type of disaster or outage. Furthermore, cloud computing services are still an emerging field, and many data center and cloud providers are still trying to figure out how to create a truly resilient environment.

    The most important point to take away here is that a cloud outage can happen for almost any reason. For example, a few administrators for a major cloud provider forgot to renew a simple SSL certificate. What happened next is going to be built into future cloud case studies. Not only did this cause an initial failure of the cloud platform, it created a global, cascading event that took down numerous other cloud-reliant systems. Who was the provider? Microsoft Azure – the very same Azure platform which had $15 billion pumped into its design and build-out. Full availability wasn’t restored for 12 hours – and up to 24 hours for many customers. About 52 other Microsoft services relying on the Azure platform experienced issues, including the Xbox Live network. This type of outage will create (and hopefully answer) many new questions around cloud continuity and infrastructure resiliency – and a simple certificate check like the one sketched below could have flagged the problem well in advance.
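
Expired certificates are one of the more preventable failure modes, since a basic monitoring script can flag them weeks ahead of time. Below is a minimal sketch using only the Python standard library; the hostname is just a placeholder.

```python
# Minimal sketch: report how many days remain on a server's TLS certificate.
# Uses only the Python standard library; the hostname is a placeholder.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(hostname, port=443):
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Mar 12 23:59:59 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    remaining = expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)
    return remaining.days

if __name__ == "__main__":
    print("Days until certificate expiry:", days_until_expiry("example.com"))
```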

As with any technological platform, careful planning and due diligence have to go into all designs and deployments. It’s evident with the speed of today’s technology adoption that the world of cloud computing is only going to continue to expand. New conversations around big data further demonstrate a new need for the cloud. The ability to transmit, analyze and quantify massive amounts of data is going to fall onto the cloud’s shoulders. Cloud services will be created to churn massive amounts of information on a distributed plane. Even now, open-source platforms are being used to help control and distribute that data. Projects like Hadoop and the Hadoop Distributed File System (HDFS) are already being deployed by large companies to keep their data under control and make it agile.
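
As a toy illustration of the distributed processing model behind Hadoop: with Hadoop Streaming, a job can be expressed as ordinary scripts that read records on stdin and emit key/value pairs on stdout. The word-count sketch below is a teaching example under those assumptions, not a recipe from any particular deployment.

```python
# Toy word count in the MapReduce style used by Hadoop Streaming: a mapper and
# a reducer that communicate via tab-separated key/value lines on stdin/stdout.
# Local dry run (simulating the shuffle phase with `sort`):
#   cat input.txt | python wordcount.py map | sort | python wordcount.py reduce
import sys
from itertools import groupby

def mapper():
    # Emit "<word>\t1" for every word seen in the input split.
    for line in sys.stdin:
        for word in line.split():
            print(word + "\t1")

def reducer():
    # Input arrives sorted by key, so counts for each word are adjacent.
    pairs = (line.rstrip("\n").split("\t") for line in sys.stdin)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        print(word + "\t" + str(sum(int(count) for _, count in group)))

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```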

With more users, more connection points and much more data, cloud computing lives in a growing, global collection of distributed data centers. It is critical that cloud developers and participants select their data center platforms carefully, with an emphasis on 100% reliability, high-density power, energy-efficient cooling, high-performance networking, and continual scalability. Moving forward, the data center will truly be the main answer to where the cloud resides.

Bill Kleyman

Virtualization and Cloud Computing Industry Expert