In 1963, ARPA (the Advanced Research Projects Agency, later renamed DARPA) presented MIT with $2 million for Project MAC. The funding included a requirement that MIT develop technology allowing a “computer to be used by two or more people, simultaneously.” One of those gigantic, archaic computers, using reels of magnetic tape for memory, thus became a precursor to what is now collectively known as cloud computing: it acted as a primitive cloud, with two or three people accessing it at once. The word “virtualization” was used to describe this arrangement, though the word’s meaning later expanded.
In 1969, J. C. R. Licklider helped develop ARPANET (the Advanced Research Projects Agency Network), a very primitive version of the Internet. Licklider, or “Lick,” was both a psychologist and a computer scientist, and promoted a vision he called the “Intergalactic Computer Network,” in which everyone on the planet would be interconnected by way of computers and able to access information from anywhere. (At the time, it sounded like an unrealistic, impossible-to-fund fantasy of the future.) The Intergalactic Computer Network, otherwise known as the Internet, is necessary for access to the cloud.
The meaning of virtualization began shifting in the 1970s, and the term now describes the creation of a virtual machine, which acts like a real computer with a fully functional operating system. The concept evolved with the Internet as businesses began offering “virtual” private networks as a rentable service. The use of virtual computers became popular in the 1990s, leading to the development of the modern cloud computing infrastructure.
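To make the virtual-machine idea concrete, here is a minimal sketch of how a modern program might enumerate the virtual machines running on a hypervisor, using the libvirt Python bindings. This is purely illustrative: it assumes a local QEMU/KVM host, and libvirt itself arrived decades after the era described above.

```python
# Minimal sketch: listing virtual machines with the libvirt Python bindings.
# Assumes a local QEMU/KVM hypervisor; any hypervisor with a libvirt
# driver would work the same way.
import libvirt

conn = libvirt.open("qemu:///system")  # connect to the local hypervisor

# Each "domain" is a virtual machine that behaves like a real computer
# with its own fully functional operating system.
for dom in conn.listAllDomains(0):
    state = "running" if dom.isActive() else "stopped"
    print(f"{dom.name()}: {state}")

conn.close()
```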
Cloud Computing in the Late 1990s
In its early stages, the term “cloud” expressed the empty space between the end user and the provider. In 1997, Professor Ramnath Chellappa of Emory University defined cloud computing as the new “computing paradigm, where the boundaries of computing will be determined by economic rationale, rather than technical limits alone.” This somewhat ponderous description rings true in describing the cloud’s evolution.
The cloud gained popularity as companies developed a better understanding of its services and usefulness. In 1999, Salesforce became a prominent example of cloud computing used successfully: it pioneered the idea of delivering software programs to end users over the Internet. The program (or application) could be accessed and downloaded by anyone with Internet access, and businesses could purchase the software in an on-demand, cost-effective manner without leaving the office.
Cloud Computing in the Early 2000s
In 2002, Amazon introduced its web-based retail services. It was the first major business to treat the then-commonplace practice of using only about 10% of computing capacity as a problem to be solved. The cloud computing infrastructure model allowed Amazon to use its computing capacity far more efficiently, and other large organizations soon followed its example.
In 2006, Amazon launched Amazon Web Services (AWS), which offers online services to other websites and clients. AWS provides a variety of cloud-based services, including storage, computation, and, through its Amazon Mechanical Turk site, “human intelligence.” Another AWS offering, the Elastic Compute Cloud (EC2), allows individuals to rent virtual computers on which to run their own programs and applications.
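As a concrete illustration of “renting a virtual computer,” here is a minimal sketch using boto3, the modern Python SDK for AWS (which did not exist when EC2 launched). The region, machine image ID, and instance type are placeholder assumptions, and the call assumes AWS credentials are already configured locally.

```python
# Minimal sketch: renting one virtual machine ("instance") from EC2.
# The AMI ID below is a hypothetical placeholder, not a real image.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

# Request one small virtual machine to run our own programs on.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical machine image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")
```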
In the same year, Google launched its Google Docs services. Google Docs was originally based on two separate products, Google Spreadsheets and Writely. Google purchased Writely, which allows users to save and edit documents and transfer them into blogging systems. (These documents are compatible with Microsoft Word.) Google Spreadsheets (acquired from 2Web Technologies in 2005) is an Internet-based, Ajax-powered program that allows users to develop, update, and edit spreadsheets and share data online; it is compatible with Microsoft Excel, and the spreadsheets can be saved in HTML format.
In 2007, IBM, Google, and several universities joined forces to develop a server farm for research projects needing both fast processors and huge data sets. The University of Washington was the first to sign up and use resources provided by IBM and Google; Carnegie Mellon University, MIT, Stanford University, the University of Maryland, and the University of California at Berkeley quickly followed suit. The universities immediately realized that computer experiments could be done faster and for less money with IBM and Google supporting their research. And since much of the research focused on problems IBM and Google had an interest in, the two companies also benefited from the arrangement. 2007 was also the year Netflix launched its streaming video service using the cloud, supporting what would become the practice of “binge-watching.”
In 2008, Eucalyptus offered the first AWS API-compatible platform for deploying private clouds. In the same year, the OpenNebula project released the first open-source software for deploying private and hybrid clouds. Many of its most innovative features focused on the needs of major businesses.
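One way to see what “AWS API-compatible” means in practice: client code written for EC2 can, in principle, be pointed at a private cloud simply by changing the service endpoint. Below is a hedged sketch of that idea with boto3; the endpoint URL and region name are made-up placeholders, not real Eucalyptus or OpenNebula addresses.

```python
# Sketch of AWS API compatibility: the same EC2 client code, aimed at a
# private cloud that speaks the AWS API (endpoint URL is hypothetical).
import boto3

private_ec2 = boto3.client(
    "ec2",
    endpoint_url="https://cloud.example.internal:8773/services/compute",
    region_name="private-1",  # placeholder region name
)

# From here, calls like describe_instances() behave as they would
# against AWS, but run against the private cloud instead.
print(private_ec2.describe_instances())
```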