23-Jun-2022
Data is omnipresent, and access to information has become one of the defining developments of recent history. We live in an era where cloud computing permeates every sector and industry, offering greater flexibility, reduced costs, and better access to resources on a global scale. Ever wonder how it came about and how it evolved? Millennials might like to imagine that cloud computing is a thing of their generation; in fact, non-local computing has been around since the 1950s. Let us trace the evolution of cloud computing in this post.
As we continue to explore how our new technological era is evolving, the societal benefits of cloud-based data keep being revealed. This expansion is becoming exponential as ever more complex applications are no longer constrained to a single physical location.
In our own lifetimes, we have watched storage evolve from floppy disks to zip drives, from CDs (and data DVDs) to USB drives, and beyond.
Non-local storage technology, which began on military mainframes, was created in the 1950s to connect computer terminals across an internal grid. It spread swiftly once it reached the scientific community. This mattered greatly at a time when a computer cost millions of dollars and several individuals needed to share a single machine.
The term "cloud computing" was first used in a 1996 internal Compaq document. The word "cloud" itself had been applied to distributed computing earlier: at General Magic, the Apple spin-off launched in the early 1990s, and before that in academic writing. According to Computerworld, the underlying concept was first proposed in the 1960s by J.C.R. Licklider, the first director of the Information Processing Techniques Office at the Pentagon's ARPA.
Licklider's concept went on to revolutionize computing when Bob Taylor and Larry Roberts created ARPANET (Advanced Research Projects Agency Network) in 1969, which eventually became the forerunner of the internet we know today.
From the 1970s onward, a variety of virtual machines (VMs) were developed, including those produced by industry titans like IBM. Telecommunications companies subsequently brought virtual private networks (VPNs) to market.
For a more in-depth understanding of the history of cloud computing, let us break down its evolution over the years and decades.
In 1963, the Defense Advanced Research Projects Agency (DARPA, then known simply as ARPA) gave MIT USD 2 million for Project MAC (Mathematics and Computation). The funding stipulated that MIT create technology that would enable "two or more persons to use a computer concurrently." The machine in question, one of those enormous early computers that used reels of magnetic tape for memory, served as a forerunner to what is now called cloud computing: with two or three users accessing it at once, it functioned as a primitive cloud. This arrangement was referred to as "virtualization," although the meaning of that term would later broaden.
In 1969, J.C.R. Licklider helped create a very early version of the Internet known as ARPANET (Advanced Research Projects Agency Network). Licklider, also known as "Lick," was a psychologist and computer scientist who advocated for a future in which everyone on Earth would be connected by computers and able to access information from anywhere; he called this vision the "Intergalactic Computer Network." The cloud depends on exactly such a network, which we now know as the Internet.
The definition of virtualization began to shift in the 1970s, coming to mean the creation of a virtual machine that functions exactly like an actual computer. The idea evolved further with the development of the Internet, as companies began renting out "virtual" private networks. The use of virtual computers gained popularity in the 1990s, which prompted the creation of the modern cloud computing infrastructure.
The cloud was initially used to represent the gap between the end user and the provider. In 1997, Professor Ramnath Chellappa of Emory University described cloud computing as the next "computing paradigm, where the boundaries of computing will be determined by economic rationale, rather than technical limits alone." This definition accurately captures how the cloud has since developed.
As businesses learned more about the cloud's capabilities and benefits, it grew in popularity. In 1999, Salesforce became a prominent example of cloud computing applied successfully: it pioneered the idea of delivering software to people over the Internet. Anyone with Internet access could reach the program or application, and businesses could buy software on demand, cost-effectively, without leaving the office.
Amazon debuted its web-based retail services in 2002. It was the first significant company to treat running at only 10% of capacity, a practice widespread at the time, as a problem to solve. The cloud computing infrastructure model allowed it to use its computers' capacity far more efficiently, and other major businesses quickly adopted the strategy.
Amazon introduced Amazon Web Services (AWS) in 2006, providing online services to other websites and customers. Its offerings span storage, processing, and even "human intelligence": Amazon Mechanical Turk lets requesters farm out small tasks to human workers, while Elastic Compute Cloud (EC2) enables users to rent virtual computers and run their own programs and applications on them.
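To make the EC2 idea concrete, here is a minimal sketch of renting a virtual machine with boto3, the AWS SDK for Python. The region, AMI ID, and instance type below are illustrative assumptions, not values from this history.

```python
# Minimal sketch: renting a virtual machine on EC2 with boto3 (AWS SDK for Python).
# Assumes AWS credentials are already configured (e.g. via `aws configure`).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID, not a real image
    InstanceType="t2.micro",          # small, inexpensive instance type
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")
```

Notice the economics Chellappa described at work: a single API call stands in for purchasing, racking, and wiring a physical server.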
Google introduced its Google Docs services in the same year. The original Google Docs was built on Google Spreadsheets and Writely. Writely, which Google had acquired, let users save and edit documents online and push them to blogging platforms (the documents are compatible with Microsoft Word). Google Spreadsheets, a web-based application acquired from 2Web Technologies in 2005, enables users to create, edit, and share spreadsheets with others online; it uses Ajax-based software that works with Microsoft Excel, and spreadsheets can be saved in HTML format.
In 2007, IBM and Google, in collaboration with several universities, created a server farm for research projects requiring substantial processing power and colossal amounts of data. The University of Washington was the first to sign up and use the resources offered by Google and IBM, followed by MIT, Carnegie Mellon University, Stanford University, the University of California at Berkeley, and the University of Maryland. These institutions quickly realized that computing experiments could be performed faster and at lower cost with Google and IBM's support. Since most of the research objectives also interested Google and IBM, the arrangement was a win-win for all parties. The same year marked the arrival of Netflix's video streaming service, built on the cloud and catering to the public's growing binge-watching habit.
The year 2008 saw the arrival of Eucalyptus, the first AWS API-compatible platform, which was used to deploy private clouds. It was followed by OpenNebula, the first open-source software for deploying private and hybrid clouds. Many of its innovative features targeted the needs and requirements of major businesses.
Private clouds emerged in 2008, driven largely by qualms over the security of public clouds, although they were still nascent and not widely adopted. By 2010, organizations such as AWS, Microsoft, and OpenStack had established largely functional private clouds. OpenStack also released its widely used, free, open-source DIY cloud platform to the general public in 2010.
2011 saw the introduction of the hybrid cloud concept, which requires the flexibility to move workloads back and forth between a private and a public cloud, along with a decent amount of interoperability between the two. Many firms wanted this because of the tools and storage that public clouds could provide, but relatively few had the systems to achieve it at the time.
In support of its Smarter Planet initiative, IBM unveiled the IBM SmartCloud framework in 2011. Apple also introduced iCloud, aimed at storing personal data (music, videos, photos, and so on). That year Microsoft also began advertising the cloud on television, educating the public about its ability to store photos and videos with easy access.
In 2012, Oracle unveiled the Oracle Cloud, providing the three essential business services: infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS). These "basics" swiftly became the norm, with some public clouds providing all three while others concentrated on just one. Software-as-a-service in particular gained a lot of traction.
CloudBolt was also established in 2012. The company is recognized for creating a hybrid cloud management platform that helps businesses create, deploy, and manage both private and public clouds, addressing the interoperability problems between the two.
By 2014, cloud computing had established its fundamental components, and security had become a top priority. Because clients value it so highly, cloud security quickly became a popular service in its own right. It has progressed substantially in the past several years and can now offer protection on par with conventional IT security systems, including safeguarding important data from accidental loss, theft, and leakage. Even so, security remains the top concern of most cloud customers, and that may always be the case.
Application developers are currently among the main consumers of cloud services. In 2016, the cloud began shifting from merely developer-friendly to developer-driven, as application developers started exploiting the cloud's tools to their full potential. Many services strive to be developer-friendly to attract more clients, and cloud providers, recognizing the need and the profit potential, created (and continue to create) the tools that app developers want and need.
Although basic containers had existed since 2004 (Solaris Containers), they were severely constrained and supported only on specific systems. Containers did not become popular until 2013, when Docker released a genuinely practical container platform.
In 2017, hundreds of established tools were adapted to make working with containers easier. One of these was Kubernetes, a container-orchestration system that Google created in 2014 and later released as open-source software; it automates the deployment, scaling, and management of containerized applications.
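As a rough sketch of what "automating deployment, scaling, and management" means in practice, here is a minimal example using the official Kubernetes Python client. The image name, labels, and replica count are illustrative assumptions, and it presumes a cluster is reachable through a local kubeconfig.

```python
# Minimal sketch: declaring a desired state and letting Kubernetes maintain it,
# using the official Python client (pip install kubernetes).
from kubernetes import client, config

config.load_kube_config()  # load cluster credentials from ~/.kube/config

# Describe the desired state: three replicas of one container (image is a placeholder).
container = client.V1Container(
    name="web",
    image="nginx:1.25",  # placeholder image, not from this post
    ports=[client.V1ContainerPort(container_port=80)],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "web"}),
    spec=client.V1PodSpec(containers=[container]),
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=template,
    ),
)

# Kubernetes continuously reconciles the cluster toward this desired state:
# if a container crashes, the orchestrator replaces it automatically.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

The key design idea is declarative: you state how many replicas you want rather than starting them by hand, and the orchestrator does the rest.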
Internet use for remote work and eCommerce has increased sharply as a result of the coronavirus outbreak. A fair prediction for the cloud's future is automated data-governance software to handle the expanding number of internet laws and regulations.
IT teams now have options that offer greater flexibility and lower costs. By improving resource utilization, cloud computing meets market demand and delivers enormous benefits to clients of all sizes, ensuring greater scalability and dependability.
Ongoing research and development suggest that cloud computing has the potential to lead the pack, but there are obstacles it must overcome if it is to keep growing.