Cloud computing is the new buzzword in the IT industry. Experts have offered various definitions for cloud computing, which is expected to dominate the software industry in the coming years. 'Cloud' is another word for the internet, and cloud computing refers to computing services provided over the internet. Individual and business consumers can run applications and store massive amounts of data on the internet. Software giants like Google, Amazon and Microsoft host the entire infrastructure needed to run applications, and users pay only for the operations they need and for the time they want. Consumers and businesses need not maintain any infrastructure at their end. Storing data on the internet as documents, photographs and online videos is a form of cloud computing. Cloud computing will make life easier for individual users and for small and large commercial setups alike. Minimal start-up costs, efficient utilisation of resources, speedy execution and parallel processing are some of the benefits of cloud computing. Data security is a major concern for users relying on service providers to store their data on the clouds.
Basics of Cloud Computing
According to Buyya, Yeo and Venugopal, 'A Cloud is a type of parallel and distributed network consisting of a series of interconnected and virtualized computers that are dynamically supplied and delivered as one or more unified computing resources based on service-level agreements negotiated between the service provider and consumers' (2008, p.2). Cloud computing will allow consumers to use computing services from a service provider as and when required and pay only for the time of their usage. Consumers, especially business houses, currently have to spend massive amounts of money on developing and sustaining IT infrastructure to utilise computing services, but cloud computing will eliminate this overhead. 'Computer utilities' will be available for use like any other utility, such as water, electricity and telephone, to homes and businesses. Software will be delivered as a service, to be used by a huge number of consumers in the way they use any other utility service. Software will no longer need to run on individual machines as in the present scenario (Buyya et al., 2008).
A cloud will act as a single access point for reaching all applications and data from anywhere around the world. Consumers will use computing services without any concern about what goes on on the other side of the cloud. It will be the job of the service providers alone to maintain a reliable and robust network that allows consumers to access their services at any time. As Buyya et al. said, 'Clouds are simply next-generation data centers with "virtualized" nodes through hypervisor technologies such as VMs, dynamically "enabled" on request as a customized resource set to fulfill a specific service-level agreement, defined through "negotiation" and accessible as a composable network through "Web 2.0" technologies' (2008, p.2). Consumers will use data centres to access computing services through the cloud, i.e. the internet (Buyya et al., 2008).
Cloud computing requires service-level agreements (SLAs) between service providers and individual users. These agreements will also specify, among other things, the Quality-of-Service (QoS) needs of individual consumers. Market-oriented resource management architecture will take over from traditional system-centric resource management architecture to meet the QoS demands of individual users. Service providers will have to use efficient resource allocation mechanisms to treat every service request individually (Buyya et al., 2008).
Market-Oriented Cloud Architecture
Fig. 3 shows a high-level market-oriented cloud architecture composed of four basic components:
1. Users/Brokers: Users or brokers put forward their service requests to the cloud through the data centres. Users can do this from anywhere irrespective of their location on the globe.
2. SLA Resource Allocator: SLA resource allocator relies on a collection of mechanisms to act as an interconnection between the data centre/cloud service provider and external users/brokers. Mechanisms supporting SLA are:
- Service Request Examiner and Admission Control: This mechanism decides whether to accept or reject a service request, based on status information retrieved from the other mechanisms of the SLA resource allocator. When a service request is submitted to the data centre, it assesses the QoS requirements of the request, with the main focus of preventing any overload on the resources; requests are rejected rather than overburdening the limited resources available. The VM Monitor mechanism provides resource availability information and the Service Request Monitor mechanism provides workload processing status. Service requests are accepted only when resources are available and the QoS requirements of the request can be met without putting extra load on the resources.
- Pricing: The job of the pricing mechanism is to decide the fee for each service request based on parameters such as submission time (peak/off-peak), pricing rates (fixed/variable) and availability of resources (supply/demand). An effective pricing mechanism is essential to prioritise the allocation and management of resources in the data centre.
- Accounting: The accounting mechanism keeps track of the actual consumption of resources by each service request. The final costs paid by consumers are calculated by the accounting mechanism from these statistics. The Service Request Examiner and Admission Control mechanism can use the information maintained by the accounting mechanism to improve its own resource allocation decisions.
- VM Monitor: The VM monitor maintains information about the availability of the VMs and their associated resources.
- Dispatcher: The dispatcher mechanism comes into play once a service request has been accepted and VMs have been allocated; it then begins the execution of the service request on the allocated VMs.
- Service Request Monitor: The service request monitor keeps track of which stage of execution each service request is in.
3. VMs: One physical machine can act as multiple dynamically managed virtual machines at one time, and can hence execute several service requests in parallel. Different partitions of the resources on a single physical machine can be managed to meet the specific QoS requirements of various service requests. Several VMs can run simultaneously on one machine without interfering with one another's operation; they can also run different operating systems and execute separate service requests.
4. Physical Machines: Physical machines are the servers that are equipped with various resources and allow several virtual machines on a single system. One physical machine can meet the demands of numerous different service requests.
(Buyya et al., 2008).
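The admission-control decision described above can be sketched as a simple function. This is a minimal illustration under stated assumptions, not the mechanism from Buyya et al.: the request fields and the single CPU-count resource model are simplifications invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ServiceRequest:
    cpus_needed: int          # resources the request asks for
    deadline_hours: float     # QoS requirement: must finish within this time
    est_runtime_hours: float  # estimated time to process the workload

def admit(request: ServiceRequest, available_cpus: int) -> bool:
    """Accept a request only if resources are free and its QoS
    (deadline) can be met without overloading the data centre."""
    if request.cpus_needed > available_cpus:
        return False  # would overburden the limited resources
    if request.est_runtime_hours > request.deadline_hours:
        return False  # QoS requirement cannot be met
    return True

# 4 CPUs free: a small request within its deadline is admitted,
# an oversized one is rejected
print(admit(ServiceRequest(2, 5.0, 3.0), available_cpus=4))  # True
print(admit(ServiceRequest(8, 5.0, 3.0), available_cpus=4))  # False
```

In a real allocator the availability figure would come from the VM Monitor and the workload estimate from the Service Request Monitor; here both are passed in directly to keep the sketch self-contained.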
The major challenge faced by cloud computing is meeting QoS requirements while providing computing services for business operations. Time, cost, reliability, trust and security are some of the decisive QoS parameters that change over time and need dynamic management (Buyya et al., 2008).
Fig. 3 (Buyya et al., 2008).
Various Cloud Platforms
According to several surveys and studies, cloud computing holds the future of the computing industry. Merrill Lynch valued cloud computing at a market potential of $160 billion, including $95 billion in business and productivity software and a further $65 billion in online advertising. A study by Morgan Stanley has also confirmed the bright future of cloud computing. Emerging demand for Platform as a Service (PaaS) and Software as a Service (SaaS) technologies among individuals and businesses has drawn several industrial and research organisations to develop cloud platforms. Table 1 compares six different cloud computing platforms prominent in the market today:
- Amazon Elastic Compute Cloud (EC2): Amazon EC2 provides a Linux-based virtual computing environment. Users can either select an Amazon Machine Image (AMI) from a library of globally accessible AMIs or build a new one. An AMI can include applications, libraries, data and associated configuration settings. AMIs are uploaded to the Amazon Simple Storage Service (S3), after which instances of them can be launched. The user pays Amazon S3 only for uploading and downloading data, and pays Amazon EC2 for the time an AMI instance is active.
- Google App Engine: Google App Engine supports Web applications created in the Python programming language. It supports the Python standard library, Application Programming Interfaces (APIs) for the datastore, Google Accounts, URL fetch, image manipulation, and email services. At present users of Google App Engine do not have to pay anything for up to 500MB of storage and around 5 million page views per month. Google users can monitor their active Web applications using the Web-based Administration Console provided with Google App Engine.
- Microsoft Live Mesh: Users of Microsoft Live Mesh can store their applications and data at a centralised location and access them on computers and mobile phones anytime, anywhere. Uploaded applications and data can be accessed from any computer or mobile device using the Live Mesh software. Microsoft has put proper security arrangements in place: a Windows Live login authenticates each user's password-protected access, and Secure Sockets Layer (SSL) protects all file transfers.
- Sun network.com (Sun Grid): The Sun Grid supports a wide range of applications; those based on Solaris OS, Java, C, C++ and FORTRAN can be processed. A local development environment similar to the Sun Grid has to be used to develop and debug user applications and runtime scripts. The user then uploads the application with its associated scripts, libraries, executable binaries and input data in zip format to the Sun Grid. Users can run and manage their applications using the Sun Grid web portal or API, and the results of execution are then downloaded to the local development environment.
- GRIDS Lab Aneka: GRIDS Lab Aneka is a .NET-based service-oriented platform for enterprise Grids. To remain adaptable, Aneka supports multiple application models, persistence and security solutions, and communication protocols. Computers need to run the Aneka container to form an enterprise Grid; the container starts the execution of services. Through Aneka's SLA support, users can state specific QoS needs such as a deadline (maximum time limit) and budget for a service request. The Gridbus broker allows remote access to the Aneka Enterprise Grid and negotiation of QoS requirements with the service provider.
(Buyya et al., 2008).
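The pay-per-use model common to these platforms, and explicit in the EC2/S3 split described above, can be sketched as a small billing function. All rates and figures below are illustrative assumptions, not actual Amazon prices.

```python
def monthly_bill(instance_hours: float, rate_per_hour: float,
                 gb_transferred: float, rate_per_gb: float) -> float:
    """Pay-per-use billing: charge only for active instance time
    (EC2-style) plus data uploaded/downloaded (S3-style)."""
    return instance_hours * rate_per_hour + gb_transferred * rate_per_gb

# e.g. 100 instance-hours at a hypothetical $0.10/h
# plus 20 GB transferred at a hypothetical $0.15/GB
print(round(monthly_bill(100, 0.10, 20, 0.15), 2))  # 13.0
```

The point of the model is that an idle consumer pays nothing: setting both usage figures to zero yields a zero bill, unlike privately owned infrastructure with fixed costs.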
Table 1 (Buyya et al., 2008).
Pic. 1 (Markoff, 2007).
Global Cloud Exchange
Cloud computing is still in its infancy and there are several limitations that users have to cope with. Businesses rely on cloud computing to increase the throughput of their systems and to meet growing demands and load. Service providers expose proprietary interfaces to their services, and due to this lack of standardisation users find it difficult to switch from one service provider to another. Service providers do not offer flexible tariff plans, and at any one time users can depend on the services of a single provider only. There is a pressing need for standards and market infrastructure to give consumers more flexibility and options. Fig. 4 shows a global Cloud exchange and market infrastructure for trading services.
Fig 4 (Buyya et al., 2008).
Various entities of the system perform different tasks to keep it running. Participants can find providers or consumers with genuine offers through the market directory. Participants submit their bids to auctioneers, who clear them regularly. Banks take care of the financial transactions between service providers and consumers. A broker is the middleman between service providers and consumers: consumers submit their requests to the broker, who then acts as a negotiator among the participants. ". . . they [brokers] mediate between consumers and providers by buying capacity from the provider and sub-leasing these to the consumers." (Buyya et al., 2008, p.7). Consumers, service providers and brokers are bound by service-level agreements. These agreements cover every point agreed upon among the participants: the type of service to be provided, QoS requirements, payments, deadlines, penalties for violation, and every other detail pertaining to the execution of the service request. Linking distinct clouds in this way gives consumers flexibility and a wider choice; they can choose the optimal service provider for their particular needs. Service providers can achieve competent capacity planning. A price-setting mechanism allows service providers to adjust the price of their services according to market conditions, user demand and the level of resource consumption. Service providers can join auctions or negotiate with brokers through an admission-control mechanism. The global cloud exchange is still in its budding stages and needs global standards and reforms to allow service providers and consumers to exploit its full potential (Buyya et al., 2008).
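The broker's role of buying capacity from a provider and sub-leasing it to consumers can be sketched in a few lines. The flat markup parameter is a hypothetical simplification of the negotiation the text describes.

```python
def broker_quote(provider_price_per_hour: float,
                 markup: float, hours: float) -> float:
    """A broker buys capacity at the provider's rate and sub-leases
    it to a consumer at a marked-up rate; the margin funds the
    broker's mediation role."""
    return provider_price_per_hour * (1 + markup) * hours

# 10 hours bought at a hypothetical $0.20/h, subleased with a 25% markup
print(round(broker_quote(0.20, 0.25, 10), 2))  # 2.5
```

In the full exchange this quote would feed into the SLA negotiated among consumer, broker and provider; here it only illustrates where the broker's margin comes from.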
Business Benefits of Cloud Architectures
- No start-up costs – Start-up costs for major applications can be very high due to the need for commercial space, hardware (racks, machines, routers, backup power supplies), hardware management (power management, cooling) and operations personnel. Cloud computing eliminates these overhead costs of a project.
- Efficient resource utilization – Situations where systems fall short of resources under heavy demand, or where resources sit idle for lack of work, are a cause of concern for companies. Cloud computing allows efficient resource management, as applications can request resources as and when required.
- Charge for utility – Service providers set up huge infrastructure, but users pay only for the resources and services they utilise and for the time they utilise them. There is no need to spend money setting up private infrastructure in the company.
- Parallel processing – Cloud architectures allow parallel processing of multiple service requests and can speed up processing to a great extent. A single processor may take many hours to complete a job that cloud computing can execute in a few minutes.
(“Amazon Web Services Developer Community : Building GrepTheWeb in the Cloud, Part 1: Cloud Architectures” 2008).
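The parallel-processing benefit can be illustrated with a toy Python sketch that fans one job out across several workers, much as a cloud fans a request out across VMs. The chunking scheme and worker count are illustrative only, and the real gain comes from many machines, not threads on one; the sketch shows only the split/process/combine pattern.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # stand-in for a piece of real work, e.g. indexing or rendering
    return sum(x * x for x in chunk)

data = list(range(1000))
# split the job into 4 interleaved chunks, one per worker
chunks = [data[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_results = list(pool.map(process_chunk, chunks))

total = sum(partial_results)
assert total == sum(x * x for x in data)  # same answer as the serial job
```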
Changes in Traditional Computing after Cloud Computing
Cloud computing will bring several changes to the IT industry. Enterprise software vendors will lose their developers and system integrators to cloud computing platforms that enable cheaper application deployment, substantial returns on investment and quicker repayment of development costs. IT owners and regulators will prefer the security mechanisms of cloud computing service providers to the unpredictability of on-premise IT. Consumers will be attracted to this low-cost, low-hassle approach to computing. Free and open-source operating systems and applications will run on simple, light client machines, relieving consumer headaches. Cloud computing will open new employment opportunities for developers from countries like India and China in their homelands, and the US and European IT markets will no longer be able to exploit the cheap labour of these developing countries. Development of new technologies and products will acquire Web speed. In addition to existing companies like Amazon, Google, CA, Microsoft and IBM investing in and developing cloud computing, several other software giants such as VMWare, Citrix, Sun, HP, Cisco, Intuit, Symantec and Yahoo will follow the trend. Heroku, Force.com, Morph Labs, Bungee Connect and the GigaSpaces Cloud Framework will draw the attention of developers who prefer web-based development and deployment platforms. Application servers such as GigaSpaces XAP and Appistry will replace traditional J2EE application servers. Rapid and remarkable development will mark the fields of system administration, configuration and network management, with automatic system administration replacing long-established, complex system administration. The Software-as-a-Service, Infrastructure-as-a-Service and Platform-as-a-Service businesses will get a boost from cloud computing. Development of DIY content distribution and media hosting with cloud computing will enable many multimedia applications.
Cloud hosting will become much cheaper than traditional hosting. Cloud computing is set to revolutionise the internet and software industry as we see them today (Coffee, Perry, Klems, Williamson, Cohen, Rushlo and Subramanian, 2008; cited in Geelan, 2008).
Future of Cloud Computing
The year 2008 saw the emergence of new market players in cloud computing. Before March 2008, very little was known and discussed about the concept. Now the perspective has changed: with many leading companies of the world trying to make their mark in this industry, cloud computing could well lower expenditure on software, database licensing and maintenance costs, and this saving could be dramatic for big industries where a high proportion of expenditure goes on these items. Cloud computing is an extension of the 'pay-for-what-you-use, use-only-what-you-need' model and works on on-demand usage. Companies like Amazon and GoGrid entered this business by providing Windows Server 2003. Microsoft will launch its applications under its own cloud in 2009. Google is extending its pack of applications and utilities by provisioning a new set of services for database management. Apart from the cloud computing service providers, there is a requirement for cloud aggregators that provide services around integrated clouds. Aggregators are required because there are a number of proprietary languages and platforms to deal with, so a single-point application might not be feasible; they will provide the interfacing or layering needed for cloud computing to reach the consumer. Cloud computing may not suit the whole IT industry, as there will be situations and requirements where local management of databases and software is needed. A hybrid partnership may then be the solution, where some applications are managed in the cloud and others at the local point; the term 'Cloudbursting' is used to describe this hybrid scenario. The big players, e.g. Amazon, Microsoft and Google, will play a key role in the development of this market. IBM has also released a cloud computing certification, named 'Resilient Cloud Validation', to standardise the concept.
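The 'Cloudbursting' hybrid can be sketched as a routing rule: keep work on local infrastructure while capacity lasts, and burst the overflow to the cloud. The load/capacity model and thresholds are hypothetical simplifications, not a scheme from the cited source.

```python
def route_job(local_load: int, local_capacity: int, job_size: int) -> str:
    """Cloudbursting: run a job locally if it fits within remaining
    local capacity, otherwise send it out to the cloud."""
    if local_load + job_size <= local_capacity:
        return "local"
    return "cloud"

# plenty of headroom -> stay local; near capacity -> burst to the cloud
print(route_job(local_load=70, local_capacity=100, job_size=20))  # local
print(route_job(local_load=90, local_capacity=100, job_size=20))  # cloud
```

A real deployment would add data-locality and compliance constraints to this decision, per the local-management concerns raised above.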
Without the participation of big companies, non-standard clouds will exist alongside standard clouds that follow agreed suites and protocols. Another layer of development resides at the system (chip) level: Intel is working to provide CPUs that can augment the use of cloud computing and allow control of built-in switches at the chip level, and GoGrid, itself a cloud computing player, has already started working on chip-level switches to increase processor throughput. Consider the example of local number portability, the concept that helped reform the services of telcos during the 1990s: cloud computing and its portability might work out the same way in the coming years and may prove to be the future of the IT industry. Chart 1 shows the falling US economy in red and the rising popularity of cloud computing in blue (Sheehan, 2008).
Chart 1 (Sheehan, 2008).
Security of the Cloud
Cloud computing can be considered a reformation of the concept of outsourcing, in which business owners retain responsibility for their information. Security in cloud computing draws on the concepts of utility computing, grid computing and Software as a Service (SaaS). Cloud computing provides information on demand, abstracted in the data centres provided by the service providers. Access to data, its integrity, its confidentiality and secure access will be the key factors in deciding the data and environment of the cloud. It can be compared to an office environment in which data is shared with another client: the primary responsibility lies with the provider to know both its own domain and the customer's access domain (checkpoints or access terminals on the client side). The security credentials of the service provider should stand up to risk analysis and assessment, and the customer's premises also require credential and security checks to ensure a secure cloud computing environment. Current security should be studied in detail to find loopholes and to identify the new security features required to set up the cloud environment. Information to be put on the cloud can be classified by criticality: mission-critical information is not safe to share on the cloud, and the criticality of information decides the security steps, risk assessment measures and other security-related issues that need to be considered before putting it on the cloud. For businesses moving to the next level of expansion, sharing data on the cloud will be the next step (Moss, 2008).
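The criticality-based classification suggested above can be sketched as a placement policy. The class names and the policies mapped to them are illustrative assumptions for the example, not a scheme prescribed by Moss.

```python
# hypothetical criticality classes mapped to storage policies
CRITICALITY_POLICY = {
    "public": "cloud",
    "internal": "cloud-with-encryption",
    "mission-critical": "on-premise",  # not safe to share on the cloud
}

def placement(classification: str) -> str:
    """Decide where data may be stored based on its criticality class;
    unknown classes default to the safest option."""
    return CRITICALITY_POLICY.get(classification, "on-premise")

print(placement("public"))            # cloud
print(placement("mission-critical"))  # on-premise
```

Defaulting unknown classes to on-premise reflects the text's caution: criticality must be assessed before data is moved to the cloud, not after.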
What On-Line Users Think of Cloud Computing
69% of online Americans use cloud computing, according to a recent study by the Pew Internet and American Life Project. In simple terms, cloud computing allows users to store data like documents, videos and photos, and applications like Microsoft Office and Adobe Photoshop, on clouds. Most people do not know that using webmail services like Yahoo Mail, Gmail and Hotmail, online photo services like Picasa and Flickr, or video-posting sites like YouTube and Dailymotion is actually using cloud computing. Around 56% of computer users use webmail. Storing personal photos or videos online is popular among nearly 40% of users, and nearly 30% use online application software like Google Documents, as shown in Fig. a. 'Convenience and versatility' are the two properties of cloud computing that make it a success for online computer users and provide significant prospects for the future of computer services (Kwik, 2008).
Online data storage with cloud computing does not impose limits on how much users can store, and there is no need to carry physical devices like CDs, DVDs or pen drives. Data can be accessed from any mobile device, anywhere, anytime, and sharing data becomes extremely simple and trouble-free. The only concern of users is that companies can use their stored data for marketing and other purposes (Kwik, 2008).
Fig. a (Horrigan, 2008).
Fig. b (Horrigan, 2008).
As shown in Fig. b, various qualities of cloud computing attract users: 51% find it easy and convenient; 41% like that they can access their stored data from any computer or mobile phone, anywhere and anytime; 39% find it easier to share data and information with others on the internet; and 34% feel relieved that their data will not be lost even if their personal computer crashes. Cloud computing is more popular among youngsters, as shown in Fig. c (Horrigan, 2008).
Fig. c (Horrigan, 2008).
Fig. d (Horrigan, 2008).
Users are worried that service providers storing their data might use it for unwanted purposes, as shown in Fig. d. 90% of users are concerned that companies might sell their data to third parties. 80% feel that service providers could use their personal photos for marketing and other purposes. 68% are bothered that companies might analyse their online behaviour and display targeted ads. 63% fear that companies may keep copies of their documents even after users delete them, and 49% are fearful that law enforcement agencies can access this data from the service providers (Horrigan, 2008).
Limitations of Cloud Computing
- Cloud computing makes it difficult for different users to configure programs.
- Scalability with sequential processing is an issue of concern.
- Users fear a breach of privacy with online data storage.
- Internet connectivity is not uniformly fast and robust.
- SLAs need to be more reliable.
- If usage of services is heavy, the cost of using a service provider can be much higher than conventional hosting. (Vikrant, 2008).
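The last limitation, that heavy usage can make pay-per-use dearer than flat-rate hosting, is simple break-even arithmetic, sketched below with illustrative figures (the rates are assumptions, not real tariffs).

```python
def cheaper_option(monthly_usage_hours: float, cloud_rate_per_hour: float,
                   conventional_flat_cost: float) -> str:
    """Compare a pay-per-use cloud bill with a flat-rate conventional
    hosting bill; past the break-even point the cloud costs more."""
    cloud_cost = monthly_usage_hours * cloud_rate_per_hour
    return "cloud" if cloud_cost < conventional_flat_cost else "conventional"

# light usage favours the cloud, heavy usage favours a flat rate
print(cheaper_option(200, 0.50, 300))   # cloud (100 < 300)
print(cheaper_option(1000, 0.50, 300))  # conventional (500 >= 300)
```

The break-even point here is simply `conventional_flat_cost / cloud_rate_per_hour` hours of usage per month.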
Microsoft Cloud Computing
When a new concept emerges at such a scale, Microsoft cannot be left out. Microsoft has also jumped into cloud computing and is seen as one of the most prolific players in this race. It will launch its cloud in 2009 and, using cloud computing, will provide free software that can be used as internet services from its cloud. This will help Microsoft reach out to billions of customers and provide them with software applications previously available only as packaged software. Brian Hall, general manager of Windows Live products for Microsoft, said, "We take the components of communication and sharing and create a set of services that become what we believe to be the one suite of services and applications for personal and community use across the PC, Web and phone." (Markoff, 2007). The Windows Live services comprise a suite of different Microsoft applications, including Windows Live Messenger, Windows Live Photo Gallery, Windows Live Mail, Windows Live OneCare Family Safety and others. Many users use the Web to store their data in the form of e-mails, photo gallery sites, networking sites and more, and Microsoft aims to become the single manager of all user data, in whatever form it is. Microsoft recently launched SkyDrive, an online data storage service where users are given 500MB of storage space on the internet. Microsoft used desktop dominance to take the lead from Netscape in the 1990s by providing Explorer as part of its desktop applications, but currently many services, like search engines and networking sites, are available on the internet only. While Google is set to offer equivalents of Microsoft's main applications, Excel and Word, on the internet, Microsoft aims to provide seamless services spanning desktop and internet, an advantage for a company with billions of users on Windows software.
Microsoft is working on a strategy to provide software plus services and to follow an on-demand business model where every application can be obtained as a package or as individual components. Microsoft is also eyeing customers on networking sites by providing a photo-sharing tool, an electronic mail system and web log writing tools, and is planning a tool for multimedia content to compete with Adobe's Flash media player in upcoming releases. With this range of software and online storage, Microsoft seems set for cloud computing. It will be a key player in cloud computing and may lead in its domain of providing services, although it has started late (Markoff, 2007).
Improved technologies, standard interfaces and protocols will open a world of opportunities for both the service providers and the users of cloud computing. Cloud computing still has several limitations pertaining to globalisation, legal borders and confidentiality of data. Youngsters are adapting to cloud computing and using it for data storage and sharing in the form of documents, online videos and photos. The flexibility and convenience of cloud computing make it an attractive new prospect: consumers can access their information and data anywhere, from any mobile device, anytime. The major concern of consumers, however, is whether their data is safe with service providers, who could use it for marketing or targeted ads, or sell it to third parties. Cloud computing will allow individual users and businesses to carry out complex operations with ease and at very high speed, paying only for the time and amount of usage.
- Amazon Web Services Developer Community : Building GrepTheWeb in the Cloud, Part 1: Cloud Architectures (2008) [online]. Available from: https://developer.amazonwebservices.com/connect/entry.jspa?externalID=1632&categoryID=100 [Accessed 16:12:08]
- Buyya, R., Yeo, C.S. and Venugopal, S. (2008). Market-Oriented Cloud Computing: Vision, Hype, and Reality for Delivering IT Services as Computing Utilities [online]. Australia: The University of Melbourne. Available from: https://arxiv.org/pdf/0808.3558 [Accessed 16:12:08]
- Coffee, P., Perry, G., Klems, M., Williamson, A., Cohen, R., Rushlo, B. and Subramanian, K. (2008); cited in Geelan, J. (2008). The Future of Cloud Computing [online]. Available from: https://web2.sys-con.com/node/771947 [Accessed 16:12:08]
- Horrigan, J.B. (2008). Cloud Computing Activities [online]. Available from: https://www.pewinternet.org/pdfs/PIP_Cloud.Memo.pdf [Accessed 16:12:08]
- Kwik, P. (2008). The Rise and Rise of Cloud Computing [online]. Available from: https://web2.sys-con.com/node/674482 [Accessed 16:12:08]
- Markoff, J. (2007). Software via the Internet – Microsoft in Cloud; Computing – NYTimes.com. [online]. Available from: https://www.nytimes.com/2007/09/03/technology/03cloud.html [Accessed 16:12:08]
- Moss, H. (2008). Technology News: Security: Putting Your Trust in the Cloud [online]. Available from: https://www.technewsworld.com/story/64761.html [Accessed 16:12:08]
- Sheehan, M. (2008). The Past, Present and Future of The Cloud [online]. Available from: https://web2.sys-con.com/node/767779 [Accessed 16:12:08]
- Vikrant (2008). Clearing The Clouds Over Cloud Computing | B.E.T.A. Daily [online]. Available from: https://www.betadaily.com/2008/04/14/clearing-the-clouds-over-cloud-computing/ [Accessed 16:12:08]