Cloud Computing Explained | History, Basics, Future, Types: SaaS, PaaS, IaaS:- Many people ask me, "What is cloud computing?" Cloud computing has evolved so much that everyone is talking about it, yet when people search the web for an answer, they often end up confused by the technical definitions. The technical definition of cloud computing goes something like this:
“Cloud computing is an information technology paradigm that enables ubiquitous access to shared pools of configurable system resources and higher-level services that can be rapidly provisioned over the internet with minimal management effort. It relies on the sharing of resources to achieve coherence and economies of scale.”
A definition like that confuses many people, so let's understand cloud computing in simple words that everyone can follow. Cloud computing does not mean there is actually something up in the clouds. The name is a metaphor: just as a cloud is made of countless tiny droplets that can move very quickly from one place to another, cloud resources are spread out and can be shifted around rapidly. "Computing" simply refers to the operations a computer performs to get you results.
“So cloud computing is a network of servers and computers that can connect to your device quickly and easily, giving you the power of a powerful computer on your small or less powerful machine.”
Today, cloud computing is becoming popular because, at very low cost, the cloud puts a very powerful machine in the palm of your hand, so you can do anything, anywhere, with just an internet connection.
History of cloud computing
Cloud computing started becoming famous in 2006, when Amazon released its Elastic Compute Cloud (EC2), a web service that provides secure, resizable compute capacity in the cloud. It was a new kind of innovation, and most leading companies soon adopted the idea and started working on their own projects along the same lines.
By mid-2008, Gartner, the world's leading research and advisory company, saw an opportunity for cloud computing, forecasting that its many possibilities would lead the future. After EC2, Google worked hard and released its Google App Engine beta for testers in April 2008. App Engine is a web framework and cloud computing platform for developing and hosting web applications in Google-managed data centers. Applications are sandboxed and run across multiple servers.
In February 2010, Microsoft launched Azure, a cloud computing service for building, testing, deploying, and managing applications and services through a global network of Microsoft-managed data centers; Microsoft had been working on it since early 2008. Azure provides software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS), and supports many different programming languages, tools, and frameworks, including both Microsoft-specific and third-party software and systems.
In July 2010, Rackspace Hosting and NASA jointly launched an open-source cloud software initiative known as OpenStack. OpenStack is a set of software tools for building and managing cloud computing platforms for public and private clouds.
In March 2011, IBM announced the IBM SmartCloud framework to support its Smarter Planet initiative. IBM SmartCloud is a line of enterprise-class cloud computing technologies and services for building and using private, public, and hybrid clouds.
SmartCloud offerings can be purchased as self-service or managed services.
On June 7, 2012, Oracle announced Oracle Cloud, which delivers a broad selection of enterprise-grade cloud computing solutions, including SaaS, PaaS, and IaaS.
In May 2012, Google Compute Engine was released in preview; it was finally launched in December 2013.
- Cloud Computing was popularised with Amazon releasing its Elastic Compute Cloud (EC2) in 2006.
- Google released its Google App Engine beta for testers in April 2008.
- By mid-2008 Gartner saw an opportunity for cloud computing.
- In February 2010 Microsoft launched Azure, which it had been working on since early 2008.
- In July 2010 Rackspace Hosting and NASA jointly launched an open source cloud software initiative known as OpenStack.
- In March 2011 IBM announced the IBM SmartCloud framework to support Smarter Planet.
- On June 7, 2012, Oracle announced the Oracle cloud.
- In May 2012 Google Compute Engine was released in preview and launched in December 2013.
Cloud Computing Basics
The cloud aims to cut costs and helps users focus on their core business instead of being impeded by IT obstacles.
What is virtualization?
The main technology behind cloud computing is virtualization. Virtualization software separates a physical computing device into one or more virtual devices, each of which can be easily used and managed to perform computing tasks. This technology gives IT operations the speed they need.
Cloud computing is a kind of grid computing that evolved by addressing quality-of-service (QoS) and reliability problems. Cloud technology provides tools and techniques to build data- and compute-intensive parallel applications far more affordably than traditional parallel computing techniques.
Some basic concepts related to cloud computing:
1. Client-server model: A distributed application structure that distinguishes between service providers (servers) and service requesters (clients).
2. Computer bureau: A service bureau providing computer services, common from the 1960s to the 1980s.
3. Grid computing: A form of distributed and parallel computing in which a "super virtual computer" is composed of a cluster of networked, loosely coupled computers acting together to perform large tasks.
4. Fog computing: A distributed computing paradigm that provides data, compute, storage, and application services closer to the client or near-user edge devices, such as network routers.
5. Mainframe computers: Powerful computers used by large organizations for bulk data processing, such as census data, industry and consumer statistics, police and secret intelligence services, planning, and financial transaction processing.
6. Utility computing: The packaging of computing resources, such as computation and storage, as a metered service, similar to a traditional public utility such as electricity.
7. Peer-to-peer: A distributed architecture without the need for any central coordination or authority; participants act as both suppliers and consumers of resources.
8. Green computing: An approach to computing that emphasizes reducing the use of materials that are hazardous to the environment.
9. Cloud sandbox: A live, isolated computer environment in which a program, code, or file can run without affecting the application in which it runs.
Characteristics Of Cloud Computing:
1. Increases and improves users' flexibility through re-provisioning, adding, or expanding technological infrastructure resources.
2. Cost reduction to a large extent, as charges are based on usage.
3. Device and location independence, as users can access the system through a web browser regardless of their location or the device they use.
4. Maintenance: Maintenance is easy because software does not need to be installed on each computer and can be accessed from different locations.
5. Multitenancy: Enables the sharing of resources and costs across large numbers of users, which allows centralization of machines at a particular place.
6. Peak-load capacity increases, with utilization and efficiency improved for systems that are often only 10-20% utilized.
7. Performance: Performance is monitored and maintained by the service provider's IT experts.
8. Resource pooling: The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to user demand.
9. Productivity: Productivity increases when multiple people can work on the same data simultaneously rather than waiting for it to be saved and emailed. Time may also be saved because information does not need to be re-entered when fields are matched.
10. Reliability increases, which makes cloud computing suitable for business continuity and disaster recovery.
11. Scalability and elasticity: Achieved through dynamic allocation of resources on a fine-grained, self-service basis in near real time.
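The resource pooling, multitenancy, and elasticity characteristics above can be sketched in a few lines of code. This is an illustrative toy model, not any real cloud API: a shared pool of virtual machines is dynamically assigned to tenants on demand and reclaimed when released.

```python
# Toy sketch of resource pooling and elasticity (all names illustrative):
# a shared pool of VMs is dynamically assigned and reassigned to tenants.

class ResourcePool:
    def __init__(self, total_vms):
        self.free = list(range(total_vms))  # VM ids available to any tenant
        self.assigned = {}                  # vm_id -> tenant

    def acquire(self, tenant):
        """Dynamically assign a VM from the shared pool to a tenant."""
        if not self.free:
            raise RuntimeError("pool exhausted: scale out needed")
        vm = self.free.pop()
        self.assigned[vm] = tenant
        return vm

    def release(self, vm):
        """Return a VM to the pool so other tenants can reuse it."""
        self.assigned.pop(vm)
        self.free.append(vm)

pool = ResourcePool(total_vms=4)
a = pool.acquire("tenant-a")
b = pool.acquire("tenant-b")
pool.release(a)               # capacity freed by one tenant...
c = pool.acquire("tenant-c")  # ...is immediately reusable by another
```

The point of the sketch is the reassignment step: the same physical capacity serves different tenants over time, which is what makes centralization and the 10-20% utilization improvement possible.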
Cloud Computing Model: SaaS, PaaS, IaaS
1) Software as a Service
Starting at the highest level of abstraction, we have Software as a Service (SaaS). In this model, the whole shebang is in the vendor's hands and you just use the service. These providers range from enormous enterprise-level software offerings like Gmail and Office365 Online down to micro-SaaS providers like one of my personal favorites: Freckle, which provides time-tracking for freelancers and teams. At this level, there is no installation of software, no updates, etc.; you just open your browser and go.
2) Platform as a Service
Next on the list is Platform as a Service (PaaS). In this model, you don't want to think about the server or its internals: you point to a virtual machine, tell your code or container to go live there, and let your application take over from there. This is where Engine Yard fits in the scheme of things, along with Heroku, OpenShift, and others. In addition, most of the larger IaaS providers also have offerings in this area.
3) Infrastructure as a Service
Further down the chain of abstraction, we have Infrastructure as a Service (IaaS) providers. We run into the heavy iron here: Amazon Web Services, Microsoft Azure, and Google Cloud are the three dominant players, with IBM and VMware playing catch-up. Here, the line between what you are doing and what the provider is doing gets thinner. You are typically using virtual machines on someone else's servers rather than servers of your own. This, of course, allows your servers to be anywhere your provider has a data center, allowing for lower latency, scaling, etc.
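The three abstraction levels above can be summarized as a responsibility split: the higher the model, the more layers the provider manages for you. The layer names below are a common simplification for illustration, not taken from any specific vendor's documentation.

```python
# Illustrative responsibility split across SaaS, PaaS, and IaaS.
# The layer list and the cut points are a common simplification.

LAYERS = ["hardware", "virtualization", "operating system",
          "runtime", "application", "data"]

# index into LAYERS up to which the *provider* is responsible
PROVIDER_MANAGES = {"IaaS": 2, "PaaS": 4, "SaaS": 6}

def responsibilities(model):
    cut = PROVIDER_MANAGES[model]
    return {"provider": LAYERS[:cut], "customer": LAYERS[cut:]}

print(responsibilities("PaaS"))
# Under PaaS the customer keeps only the application and its data;
# under SaaS the provider manages everything.
```

Reading the output for each model makes the "chain of abstraction" concrete: SaaS leaves nothing for the customer to run, PaaS leaves the application and data, and IaaS leaves everything above the virtualization layer.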
Types of Cloud Computing:
1) Private Cloud:-
Private cloud refers to a model of cloud computing where IT services are provisioned over private IT infrastructure for the dedicated use of a single organization. A private cloud is usually managed via internal resources.
The terms private cloud and virtual private cloud (VPC) are often used interchangeably. Technically speaking, a VPC is a private cloud using a third-party cloud provider’s infrastructure, while a private cloud is implemented over internal infrastructure.
Private clouds may also be referred to as enterprise clouds.
There is some controversy around the very idea of a private cloud. The central idea of cloud computing is that an organization should not need to build out and manage computing infrastructure itself. By using cloud vendors, an organization should lower costs while receiving services and applications that are on par with or better than what could be done in-house. Given this, a private cloud would seem to be going backward: the organization still needs to build out and manage the private cloud infrastructure, and it gets none of the economies of scale that should come with cloud computing.
The flip side of this argument is that not all organizations can give up control to third-party vendors. A proponent of private clouds would argue there are still significant benefits to private clouds in the sense that a private cloud is a way to centralize large installations of IT infrastructure in a highly virtualized manner while avoiding exposure to the unknowns of an outside cloud vendor.
2) Public Cloud:-
The most recognizable model of cloud computing to many consumers is the public cloud model, under which cloud services are provided in a virtualized environment, constructed using pooled shared physical resources, and accessible over a public network such as the internet.
To some extent, public clouds can be defined in contrast to private clouds, which ring-fence the pool of underlying computing resources, creating a distinct cloud platform to which only a single organization has access. Public clouds, on the other hand, provide services to multiple clients using the same shared infrastructure.
The most salient examples of cloud computing tend to fall into the public cloud model because they are, by definition, publicly available. Examples include:
- Cloud storage services
- Online software applications
- Cloud hosting, including website hosting
- Cloud-based development environments
Public clouds are used extensively in offerings for private individuals who are less likely to need the level of infrastructure and security offered by private clouds. However, enterprises can still utilize public clouds to make their operations significantly more efficient, for example, with the storage of non-sensitive content, online document collaboration, and webmail.
3) Hybrid Cloud:-
A hybrid cloud is an infrastructure that includes links between one cloud managed by the user (typically called a "private cloud") and at least one cloud managed by a third party (typically called a "public cloud"). Although the public and private segments of the hybrid cloud are bound together, they remain distinct entities. This allows a hybrid cloud to offer the benefits of multiple deployment models at once. Hybrid clouds vary greatly in sophistication. For example, some hybrid clouds offer only a connection between the on-premises and public clouds, and all the difficulties inherent in the two different infrastructures are the responsibility of the operations and application teams.
The idea behind hybrid cloud is that businesses can use it to take advantage of the scalability and cost-effectiveness offered by the public cloud computing environment without exposing mission-critical applications and data to the vulnerabilities associated with the public cloud option. In addition, the hybrid cloud model creates what is often the best and most efficient solution because different types of data can be moved onto whatever platform provides the most efficient and secure environment.
A hybrid cloud is typically created in one of two ways: either a vendor with an existing private cloud solution forms a partnership with a public cloud provider, or a public cloud provider forms a partnership with a vendor that provides private cloud platforms.
4) Community Cloud:-
A community cloud is a collaborative effort in which infrastructure is shared between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third party and hosted internally or externally. It is controlled and used by a group of organizations that share an interest. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost-saving potential of cloud computing is realized.
5) Distributed Cloud:-
Distributed cloud is a cloud computing term describing an arrangement in which the same file system is made accessible to multiple clients, who can perform operations such as create, delete, modify, read, and write. Each file may be partitioned into several parts called chunks, and each chunk is stored on a remote machine. Data is stored in files in a hierarchical tree, where the nodes represent directories. This arrangement facilitates the parallel execution of applications. There are several ways to share files in a distributed architecture, and each solution must suit a certain type of application, depending on how complex or simple the application is. Meanwhile, the security of the system must be ensured: confidentiality, availability, and integrity are all important for a secure system. Nowadays, users can share resources from any device, anywhere, over the internet thanks to cloud computing, which is typically characterized by scalable and elastic resources, such as physical servers, applications, and other services that are virtualized and allocated dynamically. Automatic synchronization is required to make sure that all devices stay up to date.
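The chunking idea described above can be sketched in a few lines: a file's bytes are split into fixed-size chunks, and each chunk could then be stored on a different remote machine. The chunk size and the round-robin placement policy below are purely illustrative choices, not how any particular system works.

```python
# Minimal sketch of file chunking for a distributed store.
# Chunk size and placement policy are illustrative assumptions.

def split_into_chunks(data: bytes, chunk_size: int):
    """Partition data into fixed-size chunks; the last may be shorter."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def assign_to_machines(chunks, machines):
    """Round-robin placement of chunk indices across remote machines."""
    return {i: machines[i % len(machines)] for i in range(len(chunks))}

data = b"example file contents for a distributed store"
chunks = split_into_chunks(data, chunk_size=8)
placement = assign_to_machines(chunks, ["node-a", "node-b", "node-c"])
assert b"".join(chunks) == data  # chunks reassemble to the original file
```

Because the chunks reassemble exactly, clients can read and write individual chunks in parallel on different machines, which is what "facilitates the parallel execution of applications" in practice.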
6) Public Resource Computing
Public-resource computing (p-r computing) relies on personal computers with excess capacity, including free disk space as well as idle CPU time. The idea of using those unused resources was proposed in 1978 by the Worm computation project at Xerox PARC, which used 100 computers to measure the performance of Ethernet. Many academic projects followed to explore this approach, including Condor, a toolkit developed at the University of Wisconsin for writing programs that run on unused workstations, typically within a single organization. The world's computing power and disk space are no longer primarily concentrated in supercomputers; instead, they are distributed across hundreds of millions of personal computers and game consoles around the world. This paradigm enables previously infeasible research while also encouraging public awareness of current scientific research and the creation of global communities centered on a specific scientific interest.
Public-resource computing does not really belong to the category of peer-to-peer applications and networks, because p-r projects usually rely on a central server architecture to produce work units and process the results, and the clients usually do not communicate with each other at all. Nevertheless, because p-r projects rely on the power and resources of client computers to do the work, there are some similarities with peer-to-peer networks.
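The central-server pattern described above (server produces work units, idle clients fetch them, compute, and return results) can be sketched as follows. All class and function names here are illustrative, and this single-process toy stands in for what would be networked machines in a real project.

```python
# Toy sketch of the public-resource computing pattern: a central server
# hands out work units; volunteer clients compute and return results.
from queue import Queue

class WorkServer:
    def __init__(self, units):
        self.pending = Queue()        # work units the server produced
        for unit in units:
            self.pending.put(unit)
        self.results = {}             # unit -> result, collected centrally

    def fetch(self):
        """Hand a work unit to a client, or None when none remain."""
        return None if self.pending.empty() else self.pending.get()

    def submit(self, unit, result):
        self.results[unit] = result

def volunteer_client(server, compute):
    """Run while the client is idle: fetch, compute, submit, repeat."""
    while (unit := server.fetch()) is not None:
        server.submit(unit, compute(unit))

server = WorkServer(units=[1, 2, 3, 4])
volunteer_client(server, compute=lambda n: n * n)  # a stand-in numeric task
```

Note how the clients never talk to each other, only to the server, which is exactly why this pattern is not true peer-to-peer despite relying on client resources.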
7) Volunteer cloud
Volunteer cloud, also known as peer-to-peer cloud and ad-hoc cloud, is the coordination of members’ computing resources to amass a cloud architecture for a single purpose.
The volunteer cloud model combines public-resource computing and cloud compute infrastructure for a distributed cloud that takes the place of centralized systems in a data center. Because these projects use participants’ resources at irregular times when their computers are idle, a volunteer cloud must be flexible and tolerant of variable capabilities.
Generally, volunteer cloud computing involves groups of people who donate their resources because they believe in the goals of a given cloud project. Nevertheless, promoting and motivating support for voluntary cloud computing projects can be one of a project's greatest challenges. Some volunteer projects, such as Cloud@Home, have tried paying volunteers for their resource donations; other projects have tried other incentives to participate, such as gamification.
8) Multi-Cloud
A multi-cloud strategy is the use of two or more cloud computing services. While a multi-cloud deployment can refer to any implementation of multiple software as a service (SaaS) or platform as a service (PaaS) cloud offerings, today it generally refers to a mix of public infrastructure as a service (IaaS) environments, such as Amazon Web Services and Microsoft Azure.
Initially, many organizations pursued a multi-cloud strategy because they were uncertain about cloud reliability. Multi-cloud was, and still is, seen as a way to prevent data loss or downtime due to a localized component failure in the cloud. The ability to avoid vendor lock-in was also an early driver of multi-cloud adoption.
9) Big Data Cloud
The rise of cloud computing and cloud data stores has been a precursor and facilitator to the emergence of big data. Cloud computing is the commodification of computing time and data storage by means of standardized technologies.
It has significant advantages over traditional physical deployments. However, cloud platforms come in several forms and sometimes have to be integrated with traditional architectures.
This leads to a dilemma for decision-makers in charge of big data projects in the cloud: which cloud computing option is optimal for their computing needs, especially for a big data project? These projects regularly exhibit unpredictable, bursting, or immense computing power and storage needs. At the same time, business stakeholders expect swift, inexpensive, and dependable products and project outcomes, so choosing well requires understanding the core cloud architectures and what to look for when getting started with cloud computing and cloud storage.
10) HPC Cloud
High-performance computing (HPC) enables users to solve complex, computationally demanding business, engineering, and scientific problems through computer modeling, simulation and analysis. The applications used in HPC require high bandwidth, enhanced networking and robust compute capabilities.
An HPC system can be a single supercomputer or a distributed cluster of GPU-intensive servers in a cloud platform. The terms “HPC” and “supercomputing” are frequently used interchangeably, but the two differ significantly in performance levels. A supercomputer is computationally much faster than an HPC system.
With HPC in the cloud you can:
- Analyze large volumes of data, such as financial calculations, in an instant.
- Save on costs by streamlining and optimizing your data processing tasks.
Cloud Engineering
Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to concerns of commercialization, standardization, and governance of cloud computing applications. In practice, it leverages the methods and tools of engineering in conceiving, developing, operating, and maintaining cloud computing systems and solutions. It is about designing the systems necessary to leverage the power and economics of cloud resources to solve business problems.
Cloud engineering is a field of engineering that focuses on cloud services, such as “software as a service”, “platform as a service”, and “infrastructure as a service”. It is a multidisciplinary method encompassing contributions from diverse areas such as systems engineering, software engineering, web engineering, performance engineering, information engineering, security engineering, platform engineering, service engineering, risk engineering, and quality engineering. The nature of commodity-like capabilities delivered by cloud services and the inherent challenges in this business model drive the need for cloud engineering as the core discipline.
Disadvantages of Cloud Computing Explained
1) Downtime
Downtime is often cited as one of the biggest disadvantages of cloud computing. Since cloud computing systems are internet-based, service outages are always an unfortunate possibility and can occur for any reason.
Can your business afford the impacts of an outage or slowdown? An outage on Amazon Web Services in 2017 cost publicly traded companies up to $150 million dollars and no organization is immune, especially when critical business processes cannot afford to be interrupted.
2) Security and Privacy
Any discussion involving data must address security and privacy, especially when it comes to managing sensitive data. We must not forget what happened at Code Space and the hacking of their AWS EC2 console, which led to data deletion and the eventual shutdown of the company. Their dependence on remote cloud-based infrastructure meant taking on the risks of outsourcing everything.
Of course, any cloud service provider is expected to manage and safeguard the underlying hardware infrastructure of a deployment. However, your responsibilities lie in the realm of user access management, and it’s up to you to carefully weigh all the risk scenarios.
Though recent breaches of credit card data and user login credentials are still fresh in the minds of the public, steps have been taken to ensure the safety of data. One such example is the General Data Protection Rule (GDPR), recently enacted in the European Union to provide users more control over their data. Nonetheless, you still need to be aware of your responsibilities and follow best practices.
3) Vulnerability to Attack
In cloud computing, every component is online, which exposes potential vulnerabilities. Even the best teams suffer severe attacks and security breaches from time to time. Since cloud computing is built as a public service, it’s easy to run before you learn to walk. After all, no one at a cloud vendor checks your administration skills before granting you an account: all it takes to get started is generally a valid credit card.
Good security practices will help your organization monitor for the exposure and movement of critical data, defend crucial systems from attack and compromise, and authenticate access to infrastructure and data to protect against further risks.
4) Limited control and flexibility
To varying degrees (depending on the particular service), cloud users may find they have less control over the function and execution of services within the cloud-hosted infrastructure. A cloud provider’s end-user license agreement (EULA) and management policies might impose limits on what customers can do with their deployments. Customers retain control of their applications, data, and services, but may not have the same level of control over their backend infrastructure.
5) Vendor Lock-In
Vendor lock-in is another perceived disadvantage of cloud computing. Differences between vendor platforms may create difficulties in migrating from one cloud platform to another, which could equate to additional costs and configuration complexities. Gaps or compromises made during a migration could also expose your data to additional security and privacy vulnerabilities.
6) Cost Concerns
Adopting cloud solutions on a small scale and for short-term projects can be perceived as expensive. Pay-as-you-go cloud services can provide more flexibility and lower hardware costs; however, the overall price tag could end up being higher than you expected. Until you are sure of what will work best for you, it's a good idea to experiment with a variety of offerings. You might also make use of the cost calculators made available by providers like Amazon Web Services and Google Cloud Platform.
Cloud Computing As an Emerging Trend
Cloud computing is growing very rapidly, and it remains a big subject of research. Cloud computing will soon capture a whole lot of industries. There is so much scope in the cloud that today most companies are moving toward adopting it and implementing it in their various projects.
For example, Microsoft has committed 90% of its $9.6 billion R&D budget to its cloud, Microsoft Azure.
Cloud Computing Future
1) Software will be king and hardware will be the servant
In a few years, there will be no need for powerful high-end computers in homes or offices, because people will be able to do high-performance tasks through simple software connected to servers located far away in data centers.
People will be able to play high-end games, do professional-level business calculations, run demanding operations, forecast weather, assist surgeries, and much more from their small, less powerful devices through software running on the cloud.
2) Software will be wholly developed over cloud
Cloud computing will enable us to develop anything on cloud servers easily and on the go. With simple installation and sharing options, it will be the best development platform for everyone.
3) Prices of Cloud Servers will go Down
People will be able to purchase cloud servers at very reasonable prices as hardware prices go down over time, which will be very beneficial for all of us.
4) Security would be a major problem
Securing cloud servers could be a challenging task, as hacking and cracking of the servers could lead to huge data losses costing millions or billions of dollars.
5) Data can be easily sharable throughout the world
Cloud computing will mostly benefit companies, as they will be able to share data with millions of people in seconds, without speed limits.
Since its arrival, the cloud has revolutionized the whole IT sector throughout the world. Today, millions of people are researching the cloud and trying to understand it in the best possible way so that they can apply it in their lives and businesses.
Cloud computing is so fast and reliable that it can do your task in a few seconds, and the best part is that it is cost-effective.
In just a few years from now, cloud computing will be so big that the whole world will do its work in the cloud, and everyone will benefit from it, though the biggest beneficiaries will be in the IT sector.
What are your views on Cloud Computing?…Comment in the comment section down below.