Management Strategies for the Cloud Revolution


How Cloud Computing Is Transforming Business and Why You Can't Afford to Be Left Behind

McGraw-Hill


Recommendation

If you’re confused about “the cloud,” you’re not alone. The definition of the cloud is a subject of intense debate, even among its inventors. InformationWeek editor-at-large Charles Babcock explains exactly what cloud computing comprises, how it differs from pre-existing technologies and how it’s going to affect the way companies operate. He describes the seemingly limitless potential of cloud computing’s huge, cost-efficient data centers without minimizing the problems facing this emerging platform. A fundamental shift in digital services is taking place. BooksInShort recommends this book to anyone curious about the brewing technological storm.

Take-Aways

  • “The cloud” is a new kind of Internet data center with enormous resources.
  • Cloud centers cluster similar models of servers that require less servicing.
  • Companies can utilize the power of the large data clusters at an economical price.
  • The cloud works around failures, shifting processes to functioning servers with no service disruption.
  • Cloud computing offers economies of scale that are more cost-effective than maintaining corporate technology in-house.
  • The cloud’s unique appeal is elasticity, which enables it to expand to handle peak workloads.
  • Virtualization allows a cloud service to have many users operating on the same server.
  • Issues of cloud security and data integrity remain unresolved.
  • A “private” or “internal” cloud is one user’s data center designed on cloud principles.
  • The hybrid cloud comprises interaction between a private and a public cloud.
 

Summary

Cloudy with a Chance of Future

The definition of cloud computing – like a cloud – changes constantly. Even Andi Gutmans, CEO of Zend Technologies and co-creator of the most commonly used language on the Internet, PHP, is reluctant to define “the cloud.”

The cloud is many things. Simply put, it is a supersized Internet data center with enormous resources. Cloud computing is the interaction between an individual or a business and a service in the cloud.

“Whether you’re ready for it or not, cloud computing is coming to the rest of the world, and those who don’t know how to adapt are going to find themselves in the path of those who do and who are getting stronger.”

You’ll often find cloud data centers in warehouses filled with racks of servers the size and shape of pizza boxes or smaller. Google won’t publish the number of servers that fuel its search engine, but it did admit to having 45,000 servers in just one of its data centers. Microsoft acknowledged that one data center that supports its Azure cloud will hold more than 300,000 servers. What makes these immense data centers different from their predecessors?

“Like a river flowing from the mountains, the Internet ‘cloud’ provides resources to distant points without incurring any extra charge.”

Unlike traditional data centers, cloud centers use servers that require less maintenance; they are set up to work around hardware failure. When one server fails, its work shifts to other servers with no disruption. Another feature is “the Web’s setting of conventions for loosely coupled systems,” which means that two systems can run in the same cloud without knowing “very much about each other.” And users can activate servers from remote locations via the web.

“When it comes to the cloud, the run book – the document of best practices – is still being written.”

The cloud also offers economies of scale that are more cost-effective for a company than keeping all its technological activities in-house. For example, Amazon charges 8.5 cents per hour for customers to use its EC2 cloud, and Rackspace’s rate is about 1.5 cents per hour. These costs are so low partly because far fewer staffers are needed to run a cloud data center than a traditional one: a single system administrator can oversee hardware that processes hundreds of applications.
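The pricing gap the book describes is easy to check with simple arithmetic. The sketch below compares a metered cloud rate against an amortized in-house server; the in-house figures (hardware price, lifetime, monthly overhead) are illustrative assumptions, not numbers from the book.

```python
# Back-of-the-envelope comparison of pay-per-use cloud pricing versus
# owning a server outright. All in-house figures are illustrative.

HOURLY_RATE = 0.085  # Amazon EC2's 8.5 cents/hour, the 2010 rate cited above

def cloud_cost(hours_used, rate=HOURLY_RATE):
    """Pay only for the hours actually consumed."""
    return hours_used * rate

def in_house_cost(server_price=3000, lifetime_months=36, monthly_overhead=120):
    """Amortized hardware plus assumed power, space and admin overhead per month."""
    return server_price / lifetime_months + monthly_overhead

# A workload that runs 200 hours a month:
print(f"Cloud:    ${cloud_cost(200):.2f}/month")
print(f"In-house: ${in_house_cost():.2f}/month")
```

Under these assumptions the metered bill is roughly a tenth of the amortized in-house cost, which is the economy of scale the book points to.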

“New uses of computing will spring up continually as cloud resources grow.”

A unique feature of cloud computing is the user’s “programmatic control.” Users can program servers directly without a go-between and pay for it simply with a credit card. They can send the server instructions and data and “manipulate the results.” This peer-to-peer relationship will usher in a new era in which everyone has access to powerful servers to process their workloads.

“Shape Shifter”

On June 25, 2009, music superstar Michael Jackson died. When millions of fans began to congregate online to mourn his passing, they overwhelmed the servers of Sony Music, Michael Jackson’s record label. Twitter’s and Ticketmaster’s systems slowed to a crawl. Under such peak pressure, the elasticity of the cloud comes in handy. Cloud computing gives customers access to “virtual machines” to handle spikes in business. Without the resources of the cloud, Sony would have had to buy extra servers, plus the bandwidth and storage required to run them, an expensive reserve that would sit unused most of the time.

“It promises to be a broadly disruptive technology wave, changing the way companies relate to their customers.”

The cloud’s ability to expand or contract depending on workflow puts small companies with access to the cloud on an even standing with larger corporations. Moreover, customers pay only for the services they use instead of building and maintaining their own data center with surplus capacity.
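This elastic, pay-for-what-you-use model can be pictured as a simple scaling rule: the fleet grows with demand and shrinks back afterward. The per-server capacity and minimum fleet size below are made-up assumptions for illustration.

```python
import math

def servers_needed(requests_per_sec, capacity_per_server=500, min_servers=2):
    """Size the fleet to current demand; idle capacity is released, not owned."""
    return max(min_servers, math.ceil(requests_per_sec / capacity_per_server))

# Normal traffic versus a sudden "Michael Jackson" spike:
print(servers_needed(800))      # quiet day: the small minimum fleet suffices
print(servers_needed(250_000))  # global spike: hundreds of machines, on demand
```

When the spike passes, the same rule scales the fleet back down, so the customer never pays for the standing reserve Sony would otherwise have had to own.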

Cloud data centers consist of parts from personal computers, including components that technology companies mass-produce. Cloud operators combine low-cost microprocessors (called x86 chips) into servers with four, six or eight central processing units (CPUs), then cluster these servers together by the thousands. As of 2010, only a few companies were building large computer clusters and selling on-demand access. These include “Amazon Web Services, Google’s App Engine, Microsoft’s Azure cloud, the Rackspace Cloud, Sun Microsystems, IBM, Yahoo!, eBay and Facebook.” In the future, it’s likely that cloud suppliers will build interconnected mega data centers around the world.

Failure Avoidance and Virtualization

Cloud designers solve failure problems that plague IT administrators. When tens of thousands of servers work together, failures occur. Thus, “fault tolerance” is a key ingredient of the cloud’s elasticity. Google, for example, installed “fault tolerant” software that routes work around failures. Managing failures becomes routine and doesn’t inhibit data center productivity. “The day may come when a giant cluster will fail, but its users won’t notice.”
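The fault-tolerance idea, work routed around dead servers so users never notice, can be sketched in a few lines. This toy scheduler is an illustration of the principle only, not Google's actual fault-tolerant software.

```python
import random

class Cluster:
    """Toy fault-tolerant scheduler: work shifts to whichever servers are healthy."""

    def __init__(self, num_servers):
        self.healthy = set(range(num_servers))

    def fail(self, server_id):
        # A hardware failure removes the server from the scheduling pool.
        self.healthy.discard(server_id)

    def submit(self, task):
        # Tasks only ever land on healthy servers; failures are invisible to the user.
        if not self.healthy:
            raise RuntimeError("entire cluster down")
        server = random.choice(sorted(self.healthy))
        return f"task {task!r} ran on server {server}"

cluster = Cluster(4)
cluster.fail(0)
cluster.fail(2)
print(cluster.submit("index-web-pages"))  # lands on server 1 or 3
```

Scaled up to tens of thousands of machines, the same routing rule is what lets a giant cluster lose hardware continuously without any visible service disruption.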

“The cloud makes moving part of the data center workload outside the enterprise a practical alternative.”

A definition of the cloud would not be complete without a discussion of “virtualization,” which is “the process of taking a physical machine and subdividing it through software into the equivalent of several discrete machines. While these machines operate independently, they share the hardware resources of a single server without impinging on each other.”

“In the cloud, the illusion of an endless resource somehow becomes a reality.”

Virtualization enables many users to operate on the same server without overlap or meshing of data. The “hypervisor, a superb automated allocator of resources among competing demands,” oversees the virtualization process and acts as a liaison between applications and hardware. When operating business applications, the virtual cloud behaves no differently than a single physical machine. This turns the traditional notion of a data center on its head. A traditional center runs one application per server, leaving many cores idle. Cloud administrators can direct multiple servers from one console. Operators can, incredibly, move virtual machines from one server to another without any interruption in function.

“It’s this hybrid of private, on-premises clouds and public clouds – the potential for offloading work during peak activity – that highlights cloud computing’s potential value to businesses.”

The cloud also has changed the way end users prepare and enter workloads. Instead of an operating system sending instructions – add these two numbers, for example – to hardware, the hypervisor takes over that operating system’s function. The hypervisor instructs the hardware about what job to perform based on the requirements of the application. This innovation severs the link between a business application and an associated piece of hardware. IT managers now have the freedom to upgrade hardware without changing the application. And a hypervisor can run Windows, Linux and Solaris on the same multitenant server.

“Private and Hybrid Clouds”

With its enormous capacities and efficiencies, cloud computing seems too good to be true until you consider security. CEOs, who are responsible for shielding their business’s data, are wary of sending everything off into the cloud. Although protections exist, stakeholders would hold the individual business users accountable in the event of a security breach or data corruption. This pervasive mistrust of the cloud’s ability to provide impenetrable security leads many CEOs to implement in-house cloud computing in a “private cloud.” Although a private cloud cannot achieve the same efficiencies gained with the operational scale of a public cloud, it is cheaper than a traditional private data center. “The adoption of private cloud computing is so young that it’s hard to talk about – it’s something that doesn’t exist fully, but is found only in skeletal, experimental form.”

“The cost of experimentation in the cloud will be slight compared to the cost of unfettered competition emerging in a field that you know nothing about.”

The cloud suffers security hazards and data integrity issues that this emerging industry has not yet worked out. If Amazon’s Elastic Compute Cloud (EC2), for example, suffers a hardware failure, Amazon moves the workload to another Amazon Machine Image (AMI), or virtual server. However, what happens if the failure affects all the servers in the same zone? Keeping your recovery system in a different zone is a wise practice but incurs an additional fee. Cloud operators are inexperienced, and multiple virtual machines sharing a single piece of physical hardware present problems never encountered before. Hackers will keep probing for weaknesses with invasive vehicles such as the Zeus botnet, “a remotely controlled agent” that attacked Amazon’s EC2 cloud. Amazon located and shut down the botnet, but such invasions are sure to continue.

“For many, its impact won’t be realized until long after its early adopters have had a long head start.”

Private clouds share the features of their public counterparts. Corporations construct them from cost-efficient PC parts run as a collection of servers via virtual management software similar to the hypervisor. To take advantage of developing services, businesses must be able to align with external resources. Wise companies will expand their x86 capacity in an effort to reorient their computing infrastructure to accommodate advances in the cloud. When private clouds can coordinate activities with public sites, that linkage will create the “hybrid cloud.”

“Companies must decide whether they are going to sit passively and watch a wave of disruption wash over their operations or harness the power of the cloud for themselves.”

Many companies will manage most of their computer functions on their private cloud. However, when inevitable peaks, called “cloudbursts,” occur, they’ll turn to the public cloud. If the data in the cloudburst is sensitive, they can offload applications that don’t require a high level of security and keep sensitive data running in their own center. For example, workloads that involve “software testing, quality assurance or staging of new applications” work well with public cloud outsourcing. Software development teams are always begging, borrowing and maneuvering server time to test new applications. The cloud, with its unlimited virtual machines and low cost, provides a solution for the testing dilemma. The cloud is also a good fit for analytics.

Obstacles

Public and private clouds should align smoothly. Unfortunately, obstacles, many of them artificial, currently prevent seamless interaction. IT managers cannot easily move workloads between their private data centers and a public cloud. Problems also stem from “incompatible file formats and proprietary moves by cloud service suppliers.” Front-runners will try to lock in customers by making it extremely difficult to switch to other vendors. For example, Amazon used open source code to construct its EC2 cloud, then modified that code to create the AMI format, which runs only on EC2. Therefore, if a business wants to move its files from EC2 to a competitor’s cloud, it must convert them to another format. Customers can’t use the AMI format in their private clouds, either.

“Once you’ve tapped into the cloud, you cease to be an isolated individual and you become part of a larger digital cosmos, where everything is linked to everything.”

Amazon isn’t the only vendor striving for “proprietary control.” VMware’s VMDK format is incompatible with other systems, and Microsoft, too, is maneuvering for exclusive advantage. Although the tech giants easily could modify the formats into one open standard, it probably won’t happen until cloud computing becomes the norm. However, customers can apply pressure by resisting vendor lock-in and demanding compatibility. Associations that promote open standards, such as the “Cloud Security Alliance” and the “Open Grid Forum,” are forming.

Why Fly in the Cloud?

Businesses have operated under the assumptions that computer resources are limited and expensive, and that using them properly requires specialized knowledge. With public clouds, however, companies can utilize the power of large data clusters at an economical price. Executives who grasp the ramifications of this radical change will gain an advantage over their competition. Those who don’t will lag behind. The economy is entering the next computing phase with the promise of fully integrated business performance. Information services, delivered inexpensively, will enable businesses of any size to compete in a technology-based world.

“Digital culture will be with us from the moment we wake up until the moment we go to bed.”

Companies can start simply by using social networking in the workplace. A firm could set up a wiki site that’s open to all employees. Staff members can congregate on the wiki to share information, breaking down geographical and departmental boundaries. The wiki eventually would become the center of the business’s collective knowledge. Organizations also could use social networking as a recruitment tool.

The cloud also can strengthen the relationship between a firm and its customers. Businesses can woo and cater to new and repeat customers in ways that previously weren’t possible. For example, businesses can offer an array of options and services that allow clients to build a customized product that fulfills their specialized needs. The cloud can capture customer data and make it available to sales representatives quickly and accurately. This provides companies with valuable insight into how their customers think, feel and behave.

The cloud is going to reorder the way companies conduct business. How companies react to this disruption may affect their future success. Taking a business-as-usual approach could be a big mistake. Companies should begin exploring and experimenting with the cloud to see how best to exploit it. The cloud will also augment the power of individual end users’ personal computers, putting the resources of these powerful networked data centers at their disposal.

About the Author

Charles Babcock is editor-at-large for InformationWeek.