Cloud Computing: The First 50 Years
Jul 19, 2024

To understand the future, you must also know the past. That's why we're taking a look at the first 50 years of Cloud Computing in this TensorWave blog.
Many technological innovations that we think of as recent inventions got their start much earlier. The computer mouse and graphical user interface (GUI), for example, are often assumed to have begun in the mid-1980s with the Apple Macintosh, but in reality they were first developed in the late 1960s. What we now call “instant messaging” dates back to the early days of the UNIX operating system.
And what we call “the internet” didn’t start in the 1990s with the advent of the World Wide Web–the protocols on which all internet traffic depends were first developed in the late 60s and early 70s.
In a similar way, the notion of “cloud computing,” believe it or not, predates the likes of Google and Amazon by a wide margin. The term “cloud computing” is fairly recent, but the idea of sharing compute resources across a network goes back to the early days of electronic computers.
In this article, we trace the history of cloud computing from its early roots to the modern day. So strap in and let’s take a quick trip through cloud computing’s first 50 years.
The Early Years
Early computers were not able to communicate with each other, but they did have one feature that underpins the modern notion of cloud computing: Time sharing.
The typical setting for the early mainframe computers was universities and large research institutions. In the beginning, these machines could run only one program at a time, and demand from researchers for computer time almost always exceeded what they could supply. To address this issue, mainframe vendors added the ability to “time share,” that is, to let more than one user access the computer simultaneously. Using scheduling algorithms that were sophisticated for their time, the processor switched rapidly among programs, running them more or less at the same time.
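For a concrete, if deliberately toy, illustration of the idea, here is a short Python sketch of round-robin time sharing, one simple scheduling approach: each job gets a small slice of processor time before the next one runs. The job names and time slice are made up for illustration, not historical code.

```python
# Toy illustration (not historical code): round-robin time sharing,
# where the processor gives each job a short slice of time in turn.
from collections import deque

jobs = deque([("job_a", 5), ("job_b", 3), ("job_c", 4)])  # (name, time units of work left)
TIME_SLICE = 2  # each job runs for at most this long per turn

while jobs:
    name, remaining = jobs.popleft()
    work = min(TIME_SLICE, remaining)
    print(f"running {name} for {work} time unit(s)")
    remaining -= work
    if remaining > 0:
        jobs.append((name, remaining))  # unfinished jobs rejoin the back of the queue
```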
In that same historical period, the U.S. government was considering the problem of getting computers to communicate with each other, even if they ran different operating systems. An early iteration of what was then called “internetworking” was ARPANET, named for the Advanced Research Projects Agency, which funded the early research. ARPANET linked computers at different universities and research sites, and the protocols it eventually adopted, Transmission Control Protocol/Internet Protocol (TCP/IP), are still in use today.
These two concepts–time sharing and internetworking–laid the foundation for cloud computing as we know it today.
The Next Key Ingredient: Virtualization
Fast-forward to the mid-1990s. The World Wide Web had been born, and graphical browsers made the internet accessible not only to researchers and large businesses but also to individual users and small companies, including a new one that called itself “Amazon.”
In addition, the trend in computing architectures had moved from time-shared mainframes to the client-server model.
Computer engineers noticed that in the client-server model, as with the mainframe model before it, the server processors spent much of their time doing nothing. Other resources, such as memory, storage, and network, were also idle most of the time. However, it was not easy to share these resources among multiple users who might need different computing environments.
Thus the “virtual machine” (VM), an idea first explored on 1960s mainframes, found a new home on commodity servers. A VM is a complete computing environment, with its own operating system, memory, software, and storage space. Several VMs can be hosted on a single physical server, sharing the physical resources and optimizing resource utilization. Cloud computing as we know it today could not exist without VMs.
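As a rough sketch of the resource-sharing idea, with entirely made-up machine sizes, the snippet below tallies how several VM definitions draw on one physical host’s CPU and memory budget.

```python
# Toy sketch (hypothetical numbers): several VMs sharing one physical host.
host = {"vcpus": 32, "memory_gb": 256}  # made-up physical server

vms = [
    {"name": "web-1", "vcpus": 4, "memory_gb": 16},
    {"name": "db-1",  "vcpus": 8, "memory_gb": 64},
    {"name": "ci-1",  "vcpus": 4, "memory_gb": 32},
]

used_vcpus = sum(vm["vcpus"] for vm in vms)
used_mem = sum(vm["memory_gb"] for vm in vms)

print(f"vCPUs allocated: {used_vcpus} of {host['vcpus']}")
print(f"Memory allocated: {used_mem} of {host['memory_gb']} GB")
```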
The Birth of Modern Cloud Computing
By the late 1990s, all of the pieces were in place for providers to offer “cloud services” (so named because early network diagrams used a cloud symbol to represent the public internet). In the 2000s, providers such as Amazon Web Services and Microsoft Azure enabled customers to rent VM servers and storage, accessed and managed through a Web-based interface; this service came to be called “infrastructure as a service” (IaaS).
Other providers, such as Salesforce.com, offered entire application suites, hosted on their own servers and accessible by browser. This came to be known as “software as a service” (SaaS). A third model, “platform as a service” (PaaS), in which the provider also manages the operating system and runtime so that customers need only deploy their code, emerged alongside them.
Cloud computing services continued to grow and evolve through the 2000s and 2010s, driven by innovations such as containerization. Container platforms such as Docker, together with orchestration systems such as Kubernetes, take virtualization a step further: the computing environment is defined up front so that it can be re-deployed easily and automatically.
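For a feel of how easily a pre-defined environment can be spun up, here is a small sketch using the Docker SDK for Python; it assumes the docker package is installed and a local Docker daemon is running, and the image and command are just examples.

```python
# Sketch using the Docker SDK for Python (pip install docker); assumes a
# local Docker daemon is available. Image and command are illustrative.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Pull the image if needed, run a throwaway container, and capture its output.
output = client.containers.run("alpine", "echo hello from a container", remove=True)
print(output.decode().strip())
```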
Another important innovation is “serverless” computing. In this model, the servers (both physical and virtual) still exist, but they are managed entirely by the service provider. Customers focus on the development or other tasks at hand without worrying about back-end infrastructure, and are charged only for the resources they use.
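Here is a minimal sketch of what that looks like in practice, written in the style of an AWS Lambda Python handler (the event fields are invented for illustration): the provider invokes the function on demand and bills only for the execution time it consumes.

```python
# Minimal sketch of a serverless function, in the style of an AWS Lambda
# Python handler. The provider runs this on demand; there are no servers
# for the developer to manage.
import json

def lambda_handler(event, context):
    """Entry point the platform calls for each request."""
    name = event.get("name", "world")  # hypothetical request field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```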
Adoption In the Enterprise
Adoption of cloud computing in the enterprise was slow at first. The idea of consigning your valuable data to a provider outside your “four walls” caused many executives to resist the move to the cloud. However, enterprises were eventually persuaded by the value proposition of cloud computing:
- Reduced costs related to purchasing, operating, and maintaining on-premises computing hardware
- More efficient resource utilization (such as the ability to turn off VMs when not in use)
- Effortless resource scaling (both up and down) as demand changes
- Easier disaster recovery and business continuity
Enterprises were also reassured by the development of sophisticated security tools optimized for cloud computing. These tools make cloud computing a viable option for sensitive and regulated industries such as healthcare and finance.
Taken together, these developments have turned many enterprises 180 degrees, so that they now think “cloud first” when contemplating IT expansion and transformation.
Cloud Computing Today: The Rise of AI and IoT
Today, cloud computing is seeing a continued rise in demand, driven by two important trends: artificial intelligence (AI) and the internet of things (IoT).
IoT observers predict that soon billions of devices, from simple sensors to mobile devices to kitchen appliances to advanced robots, will be networked together and communicating with cloud-based computing resources. Because the internet as it exists today may not be able to handle that much additional traffic, enterprises are deploying “edge computers”–on-premises devices that collect data from nearby IoT devices and pass summarized information along to cloud computers.
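The pattern is easy to sketch. The toy Python example below, with hypothetical sensor data and a placeholder upload function, summarizes raw readings locally so that only a compact summary travels to the cloud.

```python
# Toy sketch of the edge-computing pattern: summarize raw sensor readings
# locally and send only the summary upstream. Names and data are made up.
from statistics import mean

def summarize(readings):
    """Reduce a batch of raw readings to a compact summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": round(mean(readings), 2),
    }

def send_to_cloud(summary):
    # Placeholder: a real deployment would make an API call to a cloud
    # endpoint (for example over HTTPS or MQTT).
    print("uploading summary:", summary)

raw_temperatures = [21.4, 21.6, 22.0, 21.9, 35.2, 21.7]  # made-up sensor data
send_to_cloud(summarize(raw_temperatures))
```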
AI, of course, was already making significant progress in many areas of human life before the advent of large language models (LLMs) such as ChatGPT. Because of the extensive data storage and computing resources needed to train and test LLMs and other machine learning (ML) models, it’s natural for AI developers to turn to cloud computing for these tasks. And, in fact, many cloud providers have been acquiring and deploying as many graphics processing units (GPUs) as they can get their hands on to meet the demand for compute resources.
This rise in AI cloud computing has been a major driver of the success of GPU manufacturer Nvidia, but it also means that many AI developers have found it difficult to acquire GPUs of their own or to get time on cloud-based GPUs. This situation has opened the door for alternative AI cloud providers, such as TensorWave, and for other manufacturers’ GPUs, such as AMD’s MI300X line, supported by the ROCm software stack that simplifies AI development on AMD GPUs.
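As a small, hedged example of the kind of workload involved, the sketch below uses PyTorch to pick an available GPU and run a single matrix multiply, the operation that dominates ML training; on AMD hardware, PyTorch’s ROCm build exposes the GPU through the same torch.cuda interface. It assumes PyTorch is installed and falls back to the CPU if no GPU is found.

```python
# Sketch: select an available GPU with PyTorch and run a small matrix
# multiply on it. PyTorch's ROCm build presents AMD GPUs through the same
# torch.cuda interface used for Nvidia hardware.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("using device:", device)

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b  # the kind of dense linear algebra that dominates ML training
print("result shape:", tuple(c.shape))
```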
The Future of Cloud Computing
Cloud computing has come a long way from its humble beginnings over 50 years ago. What does the future hold for cloud computing?
For one thing, cloud computing is not a fad and it’s not going away. The value proposition is too strong and getting stronger all the time.
For another, AI is not going away either. Unlike AI’s earlier history, a cycle of overhyped, unmet expectations punctuated by long “AI winters” in which progress stalled, we seem to have reached a tipping point where AI in various forms is now useful and reliable enough for practical applications. The need for cloud resources to support continued AI development will grow for the foreseeable future.
The final trend worth watching is “green computing,” driven in part by concerns about climate change. Some observers predict that AI development will contribute a significant share of greenhouse-gas emissions because it is such an energy-intensive activity. To counteract this trend, advocates are pushing for greener and leaner computing by making AI development more efficient and less resource-hungry. Look for important innovations in this space in the near future.
About TensorWave
TensorWave is a cutting-edge cloud platform designed specifically for AI workloads. Offering AMD MI300X accelerators and a best-in-class inference engine, TensorWave is a top choice for training, fine-tuning, and inference. Visit tensorwave.com to learn more.