Future Trends in System Consolidation
System consolidation is the process by which companies or institutions merge several disparate systems into one. It takes different forms: a company with offices in the USA and others in Australia, for example, may be running different Enterprise Resource Planning (ERP) packages in each office, and may choose to combine these applications into a single resource management system used across all of them. Consolidation applies not only to software but also to hardware: an organization can virtualize its servers and retire the various distributed machines. Because of its advantages, and because of supporting technological changes such as growing hardware processing power, system consolidation continues to spread.
Reduction in Cost of Doing Business
Traditionally, organizations had to buy a separate system for each of their offices. Some of these systems were not designed to integrate with systems serving the same purpose elsewhere in the company, so if a need arose in another department, the business purchased yet another system for that department, driving up cost. By consolidating and automating their different services, organizations reduce this cost, and automation ensures that the organization's products remain consistent and presentable.
Reduction in Redundancy in Hardware
Redundancy has left many hardware devices underutilized. Installing a system traditionally required purchasing dedicated hardware to meet its requirements. Consider a business with one payroll system developed for its offices in the United Kingdom and another for its offices in the United States. If each payroll system is installed locally, servers and network equipment must be purchased for every office. This creates redundancy: more than one server does the work that, after system consolidation, a single central server could do for all the offices.
Easiness in Decision Making
Companies try to make decisions from the data they collect over time; as Weijter notes (businesses today have data mining systems), this data supports decision making. The data may cover sales, purchases and wages. Traditionally, each kind of data sat on a different server because the systems were acquired and used separately, so a decision-maker first had to merge information from the different servers, which added further trouble. System consolidation addresses this by saving all the data on one server, so the model builder has it readily available from a central location. As Felleisen notes (the model builder must define the problem, simulate the model, evaluate the model and implement the model), each of these steps adds to the complexity of the program. Implementation is also made easier, because the process only has to incorporate the template into the main system; this reduces cost, since there is no need to adapt templates for different office locations. Model building still adds complexity (modeling for relevance must be considered) because relevance has to be ensured.
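The consolidation step described above can be sketched in a few lines. This is a hypothetical illustration, not any real system: the office names, record categories and amounts are invented, and the point is only that data previously scattered across per-office servers becomes one central dataset a model builder can query directly.

```python
# Hypothetical sketch: merging per-office records (sales, purchases, wages)
# that were previously held on separate servers into one consolidated view.
from collections import defaultdict

def consolidate(office_records):
    """Combine per-office records into one central dataset keyed by category."""
    central = defaultdict(list)
    for office, records in office_records.items():
        for category, amount in records:
            central[category].append((office, amount))
    return dict(central)

# Illustrative data from two offices that used to sit on separate servers.
uk = [("sales", 1200), ("wages", 800)]
us = [("sales", 2500), ("purchases", 400)]

central = consolidate({"UK": uk, "US": us})
# The decision-maker can now aggregate without touching multiple servers.
total_sales = sum(amount for _, amount in central["sales"])
print(total_sales)  # 3700
```

With the data in one place, a company-wide figure such as total sales becomes a single aggregation instead of a manual merge across servers.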
Software Construction Complexity Increases
The changes that come with system consolidation are driving an increase in software complexity. Among these changes are the construction of distributed systems for company-wide use and the network improvements needed to support them. The software engineer must therefore consider factors such as the synchronization of activities carried out within the software. A banking system is a case in point: it stores data in distributed databases, so an update a client makes in one location must be reflected in all the others. Suppose a client withdraws money at one branch and the system records the withdrawal only on the local server, without updating the others. If the programmers made this mistake and omitted the functionality that updates the other servers, a client who moves to another location will find that the account balance shown is incorrect; a malicious client could then make a second withdrawal at the bank's loss. Ralph states (increased robustness of program due to the problem scaling), which eventually leads to program complexity.
Programmers must think not only about the system's functionality but also about the timeliness with which it delivers its functions: the mistake in the banking system could equally arise from a delay in updating the other servers, leading to the same errors.
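The stale-balance scenario above can be made concrete with a toy model. This is a minimal sketch, not how any real bank replicates data: branches are plain dictionaries, and a `propagate` flag stands in for the replication logic the hypothetical programmers forgot.

```python
# Illustrative sketch of the stale-balance problem in a distributed
# banking system. A real bank would use transactional replication.

class Branch:
    """One branch's local replica of account balances."""
    def __init__(self, balances):
        self.balances = dict(balances)

def withdraw(local, all_branches, account, amount, propagate=True):
    """Withdraw at one branch; optionally propagate to every replica."""
    if local.balances[account] < amount:
        raise ValueError("insufficient funds")
    targets = all_branches if propagate else [local]
    for branch in targets:
        branch.balances[account] -= amount

london = Branch({"alice": 100})
nairobi = Branch({"alice": 100})

# Buggy path: the update stays local, so the other replica goes stale.
withdraw(london, [london, nairobi], "alice", 60, propagate=False)
print(london.balances["alice"])   # 40
print(nairobi.balances["alice"])  # 100 -- a second withdrawal would still succeed here
```

With `propagate=True`, every replica is updated in the same operation and the balances agree; the bug in the text corresponds exactly to taking the `propagate=False` path.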
Cost of Software is Increasing
Manufacturers are producing increasingly capable hardware, and as Verbeek notes (hardware cost is reducing while software cost is increasing), system developers now build systems that carry out parallel computation. This improvement has also enabled system consolidation, because machines can support multiple activities at the same time and store huge amounts of data. All these factors create a need for systems that serve every purpose within the organization from a single point. Developing such systems requires modern technology and tools for engineers; companies with these abilities are few in the market, which lets them charge high prices for their products. Prices are also high because of the risky nature of software: a client company might sue the developers for a loss and demand damages, so developing companies take out insurance cover and pass the cost on indirectly to the client.
Adoption of Virtualization in the Server Space
System consolidation is creating a need for data centres and servers with large capacity. Server virtualization works by the administrator acquiring specially designed software and using it to convert one physical server into multiple virtual servers. These virtual servers act like standalone machines, each with its own operating system installed. Server virtualization brings the organization several advantages. One benefit is that the company needs to purchase only one server with high processing capability; the administrator then subdivides it into virtual servers that share the server's processing power. Networking cost is also reduced, because workstations are networked from one server rather than from different servers in different locations. Server virtualization comes with its own limitations, one of which is the slowing down of applications. This happens when a server runs multiple applications that together demand more processing power than it can supply concurrently, so they eventually slow down. To avoid this, the server is not virtualized to the point of exhausting its computing power, leaving headroom for the concurrent computation carried out by the different applications running on it.
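The headroom rule at the end of that paragraph can be sketched as a simple admission check. The core counts, VM names and 20% headroom figure are all illustrative assumptions, not values from any real hypervisor.

```python
# Hypothetical capacity planner: carve virtual servers out of one physical
# host without allocating its full processing power, leaving headroom for
# concurrent workloads as the text recommends.

def plan_vms(physical_cores, vm_requests, headroom=0.2):
    """Admit VM requests until (1 - headroom) of the host's cores are used."""
    budget = physical_cores * (1 - headroom)
    admitted, used = [], 0
    for name, cores in vm_requests:
        if used + cores <= budget:
            admitted.append(name)
            used += cores
    return admitted, used

# A 32-core host with four 8-core requests: only three fit under the budget.
admitted, used = plan_vms(32, [("erp", 8), ("payroll", 8), ("mail", 8), ("web", 8)])
print(admitted, used)  # ['erp', 'payroll', 'mail'] 24
```

The fourth request is refused precisely so that the admitted guests never contend for the host's full capacity, which is the slowdown the text warns about.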
Another limitation arises when a virtual server needs to be migrated from one physical machine to another, which led Alan to state (virtualization without power is nothing). The migration is only possible if all the machines run on the same manufacturer's processor.
Consider one server running on an Intel processor and another on an AMD processor. If the network administrator needs to move a virtual server from the Intel-based machine to the AMD-based server, that migration is not currently supported.
Despite all these limitations, companies are still migrating to server virtualization, because the technology keeps improving, reducing power consumption and leading to full utilization of servers.
Adoption of Cloud Computing
System consolidation is creating a need for companies to reduce cost while still enjoying the most recent services. Cloud computing offers exactly this: companies no longer have to buy information technology infrastructure to deploy their services. Cloud computing resembles server virtualization, except that the servers are not within the organization. The utility computing model of the cloud lets firms buy virtual server space that they can access on demand.
Cloud computing has also enabled companies to subscribe to online applications that they use through the browser, paying for the applications according to their usage rate.
A company with higher usage pays more than a company with less usage. Such applications eliminate redundancy, where companies hold applications they do not need. Cloud computing also reduces cost for the business community, mostly on the maintenance side: because a third party runs the application on your behalf, it also bears the cost and trouble of maintaining it. Companies share an application that performs the same functions, such as a point-of-sale system that the cloud provides to clients, and this sharing reduces the cost of acquiring its use. In the future, cloud computing is expected to roll out integration of the different cloud-based services, so that people who want cloud services need only one connection to the cloud instead of subscriptions to multiple clouds. Cloud providers have also tried to enhance security to encourage adoption; the measures put in place range from the categorization of sensitive data to multi-tenancy controls, since people fear malicious workers and outsiders altering their data in the cloud. Companies see cloud computing as one solution for reducing the cost of running a business. Providers of cloud services include Microsoft, Google and IBM; Google delivers some of these applications through its Chrome browser, so a company needs only an internet connection to enjoy them. Cloud computing is seen as the future of computing, as Staten notes (our beloved cloud computing market showed significant signs of maturing).
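The pay-per-use pricing described above can be expressed as a short function. The hourly rate and minimum fee are invented for illustration; real cloud providers use far more elaborate tariff schedules.

```python
# Illustrative pay-per-use billing: each company is charged in proportion
# to its usage, so a heavy user pays more than a light user.

def monthly_bill(usage_hours, rate_per_hour=0.05, minimum=10.0):
    """Charge per hour of usage, with a small minimum monthly fee."""
    return max(usage_hours * rate_per_hour, minimum)

print(monthly_bill(1000))  # 50.0 -- heavy user pays more
print(monthly_bill(100))   # 10.0 -- light user pays only the minimum
```

The `minimum` parameter is an assumption added to keep the sketch realistic: subscription services typically charge a floor fee even in idle months.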
Reducing the Number of Operating Systems
Adoption of system consolidation has led institutions to deploy the same operating system on their machines. Traditionally, while each office ran its own system in its own environment, the offices installed whichever server operating system supported that application. This left organizations with multiple operating systems across their offices, such as Windows, UNIX and Linux, each with its own maintenance and operational needs. Medhi noted that different operating systems (each operating system requires an administrator) increased the complexity of data transfer between the systems, and these needs led companies to hire separate administrators for the separate administrative roles. System consolidation now makes it possible to share the workforce between offices, thanks to the uniformity of operating systems and applications. Using the same operating system is also beneficial because companies can purchase licenses in bulk and enjoy economies of scale: a company buys a license that is stored on the server, and all machines in the domain are activated automatically.
Adoption of Grid Computing
Grid computing is a form of computing in which independent hardware and software components are pooled together and provided to users on demand. It mostly involves virtualization of computing power, storage and network capacity, supplied when needed. Grid computing adapts to the needs of the working environment by analyzing the workload and the demand for resources so that it can adjust the supply. It has brought standardization of information technology infrastructure, such as using the same operating system, and has adopted virtualization of IT resources, eliminating redundancy in their usage. Virtualization in grid computing ensures that an application is not tied to a particular server or IT resource. The automation in grid computing benefits organizations because they do not need to hire extra staff to operate the environment.
The grid environment has been adopted in modernizing data centers through a number of its techniques:
- Grid computing can consolidate the different resources in the data center.
- Grid computing employs agile IT operations.
- Grid computing is scalable, able to increase or reduce the supply of resources as needs arise, which suits data centers.
- Grid computing offers continuous availability of resources, keeping the data centre available to clients at all times.
Because of these benefits, enterprises are adopting grid computing as a matter of choice. Grid computing has also enhanced green computing and energy saving by reducing the power consumed by servers in data centers.
Grid computing has also taken disaster protection into consideration. Data centers cannot survive complete failure caused by natural disasters such as lightning and floods, which makes it sensible to adopt grid computing; as Ian states (grid computing carries dynamic shifting of workloads from one centre), enterprises can prevent such losses by adopting grid computing and ensuring workloads shift dynamically from one center to another.
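The dynamic workload shifting just described can be sketched as follows. The centre names, capacities and job names are all hypothetical; the point is only the failover mechanism: when one centre fails, its jobs move to surviving centres with spare capacity.

```python
# Sketch of dynamic workload shifting between data centres: jobs on a
# failed centre are reassigned to surviving centres with free capacity.

def shift_workloads(centres, failed):
    """Move jobs off a failed centre onto centres with spare capacity."""
    orphaned = centres[failed]["jobs"]
    centres[failed]["jobs"] = []
    for job in orphaned:
        for name, centre in centres.items():
            if name != failed and len(centre["jobs"]) < centre["capacity"]:
                centre["jobs"].append(job)
                break
    return centres

centres = {
    "east": {"capacity": 3, "jobs": ["billing", "mail"]},
    "west": {"capacity": 3, "jobs": ["web"]},
}
shift_workloads(centres, "east")  # east is hit by a flood, say
print(centres["west"]["jobs"])  # ['web', 'billing', 'mail']
```

A real grid scheduler would also weigh network distance, data locality and load, but the fixed-point idea of draining a failed centre into the survivors is the same.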
Automation of Most Services
System consolidation has led to the automation of most service delivery and activities. As the complexity of some systems increases, more functionality arises with it. Among the automated services is resource allocation during the processing of certain data. Intelligent networks are being installed in institutions to redirect bandwidth to the parts of the offices where it is needed most. These networks can learn the pattern of network usage, such as when a certain department sends huge chunks of data to the servers and when it is dormant. The learning can be done with models such as neural networks, as Adey notes (neural networks analyze data to get optimal solution); after this learning, utilization of the network becomes efficient, because bandwidth is not supplied where it is not required. Another form of automation is data backup. Backups within organizations occur at a certain frequency, and workers can sometimes forget; to avoid this, the system is set to carry out the backup automatically after a set period. Automation has the advantages of reducing the manpower the organization needs for these activities and of conserving energy (wireless networks have been improved to save on energy consumption).
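The automatic-backup idea can be shown in a few lines. This is a minimal sketch under stated assumptions: the interval is an arbitrary daily frequency, the clock is passed in explicitly rather than read from the system, and `run_backup` stands in for whatever the real backup job would be.

```python
# Sketch of interval-based automatic backup: the system backs itself up
# once a fixed period has elapsed, so workers cannot forget.

BACKUP_INTERVAL = 24 * 3600  # seconds; an illustrative daily frequency

def maybe_backup(last_backup, now, run_backup):
    """Run a backup if the interval has elapsed; return the new last-backup time."""
    if now - last_backup >= BACKUP_INTERVAL:
        run_backup()
        return now
    return last_backup

log = []
# 25 hours after the last backup, the job fires on its own.
last = maybe_backup(0, 25 * 3600, lambda: log.append("backup"))
print(log, last)  # ['backup'] 90000
```

In production this check would sit inside a scheduler loop (or be replaced by cron), but the logic of "compare elapsed time against the configured frequency" is the whole of the automation the text describes.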
Remote Location Data Backup Being Adopted
Organizations are starting to realize the need for an extra copy of their data stored in a remote location. Remote data backup involves a company hiring server space from another company, which is then charged with keeping the data safe. If an accident at the company's offices destroys the data, it can be reconstructed from these backup centers. Backup centers charge companies a fee according to the amount of server space required. One of the main challenges is infiltration of the backup centers by malicious people, which would cause a loss of confidentiality and integrity for the data of many institutions. Fears have also been raised that governments could use data centers to access personal information, for example the financial records of a company believed to be involved in illegal activities. These fears discourage companies from using government-sponsored data centers; they prefer to establish their own centers even at higher cost. The law has also stepped in to prevent the government from accessing these centers without due legal process: a court order has to be obtained before anyone's personal information can be accessed from such a facility.
Reducing the Number of Vendors Purchases are Made From
Companies are starting to learn the benefits of sourcing resources from the same vendors, driven by after-sales service and compatibility. Compatibility matters most in server virtualization and migration: a virtualized server can only be migrated within a homogeneous processor platform, not in a heterogeneous environment. To avoid a situation where you cannot transfer data between your own machines, equipment should be purchased from the same supplier. Buying from the same vendor also builds a relationship between the company and the vendor, allowing the vendor to offer maintenance services at a lower price, or to upgrade the company's hardware below market value because of this mutual relationship. It also means a business can request a custom-made device that will be compatible with its existing devices, because they come from the same supplier.
Data Sharing is Encouraged
Institutions now share data, but with constraints attached. An institution may ask a bank for information about a client's creditworthiness, which helps it decide whether to lend the person money. Similarly, an employment agency may be allowed to access the databases of educational institutions.
The employment agency is limited in the amount of data it can mine from the institution's database: it can only confirm whether a certain person was a student at that institution and whether they graduated. This form of data sharing makes positive use of the data held by different organizations. Data sharing has also been criticized because the data can fall into the wrong hands, leading to a loss of confidentiality.
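The constrained sharing just described amounts to exposing only a narrow view of the underlying records. In this sketch the student records, field names and the `verify` endpoint are all invented: the institution holds grades and fees internally, but the shared interface reveals only enrollment and graduation.

```python
# Sketch of constrained data sharing: the employment agency may only
# confirm enrollment and graduation; it never sees grades or fees.

STUDENTS = {
    "jane": {"graduated": True, "grade": "A", "fees_owed": 0},
    "omar": {"graduated": False, "grade": "C", "fees_owed": 300},
}

def verify(name):
    """Shared endpoint: expose only enrollment and graduation status."""
    record = STUDENTS.get(name)
    if record is None:
        return {"enrolled": False}
    return {"enrolled": True, "graduated": record["graduated"]}

print(verify("jane"))  # {'enrolled': True, 'graduated': True}
print(verify("kim"))   # {'enrolled': False}
```

Limiting the shared view in code, rather than trusting the consumer to ask politely, is what keeps the confidential fields out of the wrong hands.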
Data sharing is also used in scientific research and surveys, where a pool of data is made available, as Van notes (the power of research in the future is sharing of information among organization).
More Research Pertaining to System Security
Traditionally, each office or location where the system was implemented had to provide its own security, so if an institution lacked a uniform security policy, some offices might leave the system prone to attack. System consolidation eliminates this by storing all information in one central location where total security can be provided. The company can then introduce an access control list across its system as a whole (access control restricts what each individual can access from the servers), and providing security becomes cheaper. System consolidation carries its own risk of attack by malicious people: if the company has consolidated its servers into a single data center, a natural disaster could destroy all its data. To guard against this, remote data backup centers are being built to hold the data on behalf of clients.
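The access control list mentioned above can be sketched very simply. The user roles and resource names here are hypothetical; the mechanism is just a mapping from each user to the set of resources the consolidated servers allow them to reach.

```python
# Minimal access-control-list sketch for the consolidated servers:
# each user maps to exactly the resources they may access.

ACL = {
    "payroll_clerk": {"payroll"},
    "manager": {"payroll", "sales", "reports"},
}

def can_access(user, resource):
    """Return True only if the ACL grants this user the resource."""
    return resource in ACL.get(user, set())

print(can_access("payroll_clerk", "sales"))  # False
print(can_access("manager", "sales"))        # True
print(can_access("visitor", "payroll"))      # False -- unknown users get nothing
```

Defaulting unknown users to an empty set is the deny-by-default posture that makes a centralized ACL safer than per-office ad hoc rules.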
The Shrinking Private Data Centre for Companies
Companies are shrinking their private data centres because of the introduction of distributed systems and cloud computing. This has also been driven by the fact that many companies purchase their applications from vendors that offer their services online and therefore host the applications themselves. Storage space outside the organization is perceived to be cheaper than storage within it, since an in-house data center increases the overhead cost of security.
More Businesses are Absorbing Information Technology
As system consolidation takes hold in most companies, it is enabling them to move away from the traditional centralized IT department. This is happening because of the benefits businesses perceive in system consolidation: they observe that consolidation cuts costs by some percentage and encourages efficiency. Mergers between businesses also force them to study the systems existing on both sides (the study reveals similar activities that can be carried out by one system). After such a study, a system that can perform all the activities across the different locations is developed to eliminate redundancy, with the main aim of introducing a central point of control over the businesses' different activities.
Introduction of Expert Systems in Organization
Expert systems make decisions by reasoning about situations. System consolidation makes it easier to implement expert systems by providing the required environment. An expert system applies backward and forward chaining to acquire knowledge from the business environment. Almost every organization will adopt expert systems in the future because of their power in decision making; Cohn states that (artificial intelligence systems will assist managers in making sales forecasts and also may be production decisions).
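The forward chaining mentioned above can be shown with a toy rule engine. The rules and facts here are invented business examples; a real expert system would have a far richer knowledge base, but the fixed-point loop is the essence of the technique.

```python
# Toy forward-chaining sketch: rules fire whenever all their premises are
# known facts, and the loop repeats until no new conclusions appear.

def forward_chain(facts, rules):
    """Apply (premises -> conclusion) rules until a fixed point is reached."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (["demand_high", "stock_low"], "increase_production"),
    (["increase_production"], "raise_sales_forecast"),
]
facts = forward_chain({"demand_high", "stock_low"}, rules)
print("raise_sales_forecast" in facts)  # True
```

Note how the second rule only fires after the first has added `increase_production` to the fact set: this chaining of intermediate conclusions is what lets an expert system derive a sales-forecast decision from raw observations, as Cohn's remark suggests.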
- Adey, R. "Advances in Engineering Software." Generalized regression neural network in modeling 2, no. 37 (2006): 406-408.
- Akyildiz, I. "Ad Hoc Networks." Energy conservation in wireless sensor networks 7, no. 3 (2009): 537-568.
- Alan Maddison. "The Architecture Journal." Virtualization without Control Power Is Nothing 20, no. 4 (2011): 5-6.
- Arnold, L. "International Journal of Foundations of Computer Science." Heterogeneity in computing 1045, no. 101142 (2011): 1471-1493.
- Cohn, A. "Artificial Intelligence." Argumentation in artificial intelligence 2, no. 170 (2007): 619-641.
- Eugene, H. "Computers and Security." Towards a location-based mandatory access control model 26, no. 1 (2006): 36-44.
- Felleisen, Matthias. "Journal of Functional Programming." Complexity in Programming 45, no. 20 (2010): 30-32.
- Gelenbe. "The Computer Journal." The distributed systems 345, no. 2144 (2010): 5-6.
- Ian Foster. "The Journal of Grid Computing." Grid computing future 9, no. 10723 (2011): 5-7.
- Mahanti, Prabbat. "Journal of Computers." Operating Systems 5, no. 1796 (2009): 20-21.
- Medhi, Deep. "Journal of Network and Systems Management." Control of distributed systems and communication networks for voice, data, and networked computing 1064, no. 10922 (2010): 50-60.
- Meyer, A. "Information and Computation." Logistics of communication and change 204, no. 11 (2006): 1620-1662.
- Staff, C. "Computerworld." Reshuffle at Prime Targets Integrated Systems 7, no. 10 (2011): 51.
- Staten. "HP Virtualization Journal." Taking the Pulse of Cloud Computing, Fall 2011 15, no. 12 (2011): 1.
- Ralph, D. "Mathematical Programming." On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming 24, no. 20 (2011): 20-22.
- Verbeek, H. "International Journal of Engineering." Computer Systems 45, no. 1985 (2010): 1-2.
- Weijter, A. "Information Systems." Business process mining 32, no. 5 (2007): 713-732.
- Willy. "The Journal of Corporate Computing." Computer system consolidation 5, no. 3 (2010): 2.