1. Introduction
Cloud migration is the process of moving local applications, data, and infrastructure to the cloud [1]. The cloud provides computing resources on demand without active management by its users, since the required software, hardware, and even network can be hosted in the cloud. Users no longer need to raise requests with the ICT helpdesk and waste time waiting for a response to perform routine tasks. Services are available 24/7, and users request only the services they need, which is the essence of on-demand self-service. Some cloud services even update and upgrade the available software automatically, so users can access the services on the platform at any time [2].
Some of the benefits and characteristics of cloud computing are mentioned below:
- a.
On-demand self-service
The on-demand self-service model of cloud computing enables provisioning of computing resources without human interaction with the cloud provider. Users can access the services at any time, as long as there is an internet connection, and some applications even work offline; examples include requesting additional storage space or access to a database. One exception is that services are unavailable during downtime for maintenance [2]. In public clouds, for instance, service providers typically control the time and date at which the system is taken offline for maintenance, whereas in private clouds users control the upgrade and maintenance cycle and can monitor the systems to prevent outages [3]. Despite this exception, on-demand self-service benefits the commercial world, especially online business platforms and global companies spanning different time zones: staff can work at any time with an internet connection, which raises productivity. It also addresses the lack of collaboration between employees and potential international partners, for whom communicating in identical timeframes may be difficult because of time-zone differences. As for employees, working from home is no longer unusual, especially since the Covid-19 outbreak; they can easily exchange and update data such as sales figures and software development progress in the cloud with a single click [3].
This is made possible by the broad network access of cloud computing, which lets users around the world share information through cloud service networks. The network bandwidth and latency of the cloud enable time-sensitive manufacturing applications to function efficiently. This characteristic also improves the mobility of data sharing, since data is accessible on mobile phones and laptops at home; employees can work from any location, which suits the trend of working from home after the Covid-19 outbreak. It is well suited to businesses that operate internationally, enabling employees in different countries to exchange data. Data integration becomes possible, and errors and mistakes in the data recording system can be eliminated as well [4].
- b.
Data Security and Backup
Moreover, the cloud can solve the problem of unsecured database systems compared with on-premises ICT infrastructure. Because most such databases are handled manually, human error and even data loss are possible. Without backup protection, a successful attack can destroy the data in seconds, and in the worst case the hardware may become unusable as well [4]. When a user stores data in the cloud, a copy is produced as a backup and stored on another server to prevent data loss or corruption. Service providers control and manage the data system, so users must place their trust in the providers. This can make it seem that the service provider owns the stored data, raising the issue of ownership: the provider has full control of it, and its employees could access the data as well.
- c.
Choice of public/private data
The cloud can also solve the problem of deciding which data should be stored privately and which opened to the public. Traditional ICT infrastructure is generally used either by a private institution or by a public institution. Within each institution, stored data can be viewed by employees and is accessible to anyone who holds the identification number for a particular matter. There is no hybrid arrangement that stores some information privately and other information publicly; in other words, data stored by a user in the ICT infrastructure remains personally identifiable.
The multi-tenancy characteristic of the cloud enables users to share the same applications without revealing their identities: a single program serves different user groups, much like tenants in a condominium sharing the same building. This is seen in a community cloud, where a group of users with a common goal can share cost and expertise among themselves. Users can choose which data or information is stored privately or publicly and who has access to the stored data.
- d.
Optimized Resource sharing and allocation
Consider two departments, such as quality assurance and marketing, whose CPU and memory usage differ. Resource pools can address this: management can set one party's CPU share to high and the other's to normal, allowing the quality assurance users to execute automated tests, while a second resource pool with fewer CPU and memory resources is sufficient for the marketers' lighter workload. When the quality assurance department does not fully utilise its allotted resources, another department may use them. Management can adjust resource allocation, or add, delete, or rearrange resource pools as required. The pools are isolated from one another while sharing the underlying resources, so a change in allocation within one pool has no unintended consequences for unconnected pools [5]. Moreover, the Service Level Agreements (SLAs) offered by cloud service providers ensure Quality of Service (QoS) [6].
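As a toy illustration of the proportional-share idea described above, the sketch below splits a host's CPU capacity between two pools according to their share level. The share weights, pool names, and function are illustrative assumptions, not any real hypervisor's API.

```python
# Hypothetical proportional-share CPU allocation between resource pools
# (e.g., quality assurance on "high", marketing on "normal").
SHARE_WEIGHTS = {"high": 4, "normal": 2, "low": 1}

def allocate_cpu(total_mhz, pools):
    """Split total CPU capacity among pools in proportion to their share level."""
    total_weight = sum(SHARE_WEIGHTS[level] for level in pools.values())
    return {name: total_mhz * SHARE_WEIGHTS[level] // total_weight
            for name, level in pools.items()}

pools = {"qa": "high", "marketing": "normal"}
print(allocate_cpu(6000, pools))  # → {'qa': 4000, 'marketing': 2000}
```

Raising or lowering one pool's share level redistributes capacity without touching unrelated pools, which mirrors the isolation property described above.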
- e.
Meeting changing business needs
The goal of cloud computing is to reap the benefits of distributed resources while also solving the problem of large-scale computational overhead as business needs change [7].
The need to obtain more computing resources quickly when demand rises is addressed by rapid elasticity, which allows users to adjust resources up or down at any moment, removing the need to keep extra infrastructure on hand for dynamic workload surges. Cloud providers are considered more flexible when they can swiftly modify resources to meet changing needs. Rapid elasticity also helps avoid under- and over-provisioning of computing resources: over-provisioning means purchasing more capacity than required, while under-provisioning means allocating fewer resources than are needed [8]. Over-provisioning wastes money, and under-provisioning causes performance problems; both can lead to downtime, which reduces income and customer satisfaction. Rapid elasticity in cloud services minimises the chance of facing such difficulties for short-term resource demands such as web traffic spikes and database backups [9].
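The scale-up/scale-down decision behind rapid elasticity can be sketched with a simple threshold rule. The thresholds and instance limits below are illustrative assumptions, not any provider's autoscaling defaults.

```python
# Minimal sketch of rapid elasticity: adjust instance count based on
# average utilisation, bounded by a minimum and maximum fleet size.
SCALE_UP_AT = 0.80    # add capacity when average load exceeds 80%
SCALE_DOWN_AT = 0.30  # release capacity when it drops below 30%

def scale(instances, avg_load, min_instances=1, max_instances=10):
    if avg_load > SCALE_UP_AT and instances < max_instances:
        return instances + 1   # counter under-provisioning during a spike
    if avg_load < SCALE_DOWN_AT and instances > min_instances:
        return instances - 1   # counter over-provisioning when idle
    return instances           # within the healthy band: do nothing
```

In the band between the two thresholds the fleet is left alone, which is what lets the user avoid both paying for idle capacity and starving a traffic spike.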
When an enterprise expands and requires more storage space, computing power, bandwidth, and user accounts, the measured service characteristic can help. By implementing an abstract metering capability appropriate to the service type, the cloud system automatically regulates and optimises resource utilisation.
It is a delivery model in which the provider tracks how much of each service a customer consumes in a given period. Measured services give providers information on resource use and give clients a clear grasp of the invoicing mechanisms for the services they consume. The service can also exploit usage patterns over time, choose coding and compression methods wisely, improve the data compression ratio, and save storage space by pre-reducing precision and aggregating historical data [9].
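The pay-per-use billing that measured service enables can be sketched as a simple metering-and-invoicing step. The resource names and per-unit rates below are hypothetical, chosen only to show the mechanism.

```python
# Sketch of measured service: the provider meters consumption per billing
# period and bills each tenant only for what was actually used.
RATES = {"cpu_hours": 0.05, "storage_gb": 0.02, "egress_gb": 0.09}  # $/unit

def invoice(usage):
    """Return (line items, total) for one tenant's metered usage."""
    lines = {res: round(qty * RATES[res], 2) for res, qty in usage.items()}
    return lines, round(sum(lines.values()), 2)

lines, total = invoice({"cpu_hours": 100, "storage_gb": 50, "egress_gb": 10})
print(lines, total)  # → {'cpu_hours': 5.0, 'storage_gb': 1.0, 'egress_gb': 0.9} 6.9
```

Because every line item is derived from metered quantities, both sides see the same transparent mapping from usage to cost described above.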
The rest of the paper is organized as follows:
Section 2 presents the cloud migration methods, tools and techniques.
Section 3 presents cloud deployment and service models,
Section 4 gives the overview of the SDN enabled cloud computing, and
Section 5 concludes the paper.
2. Cloud Migration Methods
Online migration is the migration of systems that provide services online from one location to another, with no downtime and no disruption to services, and is transparent to users. The process is similar to encapsulating a virtual machine in a file: the configuration and in-memory running state of the business environment are transferred to another physical machine over the network while the original running state is maintained throughout, until the workload is finally migrated from the source physical host to the destination host on the cloud.
Offline migration is currently more common and is often referred to as regular migration: the host must be shut down before migration, its system state is copied to the destination host via storage or the network, the state is then reconfigured on the target host in the cloud, and the system is powered back on. This method is simple and straightforward, but it requires a clear procedure for stopping the virtual machine, and it must be acceptable for the user to be unable to provide business services for a period of time, so it suits workloads with low availability requirements.
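The offline ("regular") procedure just described can be summarised as four ordered steps: shut down, copy state, reconfigure, power on. The `Host` class and function names below are illustrative stubs, not any real migration tool's API.

```python
class Host:
    """Stand-in for a physical or cloud host; methods are no-op stubs."""
    def shutdown(self, vm): pass
    def export_state(self, vm):
        return {"vm": vm, "state": "disk+config"}  # copied via storage/network
    def import_state(self, vm, image): pass
    def power_on(self, vm): pass

def offline_migrate(vm, source, target):
    """Run the four offline-migration steps in order and record them."""
    steps = []
    source.shutdown(vm)                  # host must be shut down first
    steps.append("shutdown")
    image = source.export_state(vm)      # copy system state off the source
    steps.append("export")
    target.import_state(vm, image)       # reconfigure state on the cloud host
    steps.append("import")
    target.power_on(vm)                  # power the system back on
    steps.append("power_on")
    return steps
```

An online migration would instead keep the source running and transfer memory state iteratively, with no shutdown until the final cutover, which is why it is transparent to users.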
2.1. Cloud Migration Scenarios
Companies can migrate their business processes to the cloud in two ways: partial migration and overall migration.
- a.
Partial migration
Partial migration means that users divide their data into two parts: one part is migrated to the cloud platform while the other is retained locally. The local server and the cloud platform then cooperate and share data to deliver the service.
- b.
Overall migration
Overall migration is when users move all their data and content to a single cloud platform at the same time, in one step. The comparison of the two is shown in Table 1.
2.2. Cloud Migration tools and services
This section presents some popular cloud migration tools and techniques.
- a.
PlateSpin Migrate
PlateSpin Migrate is a powerful workload-portability migration tool developed by Novell that provides secure and reliable migration between data centres. It is designed to let users migrate old server systems (including operating systems, application software, and business data) to server models from different manufacturers without taking the original physical server offline [10]. It is among the most versatile and best-performing migration tools of its kind. PlateSpin Migrate supports physical-to-virtual, virtual-to-image, image-to-physical, and other conversions, and can easily migrate physical machines into any virtual machine environment. Further, modifications can be synchronised to keep the virtual copy up to date while the actual copy is still in production. It automates the process of transferring physical or virtual servers to enterprise cloud platforms across the network [11].
Figure 1 shows the automatic server workload migration over the network using PlateSpin migrate tool.
- b.
AWS Migration Services
Amazon, the industry leader in public clouds, provides a number of tightly linked cloud migration services, including AWS Migration Hub, the Snow series of offline data transfer devices, AWS Database Migration Service, and AWS Server Migration Service. If the business is an AWS customer, most services are free, but these tools only support the AWS cloud platform. Clients can use AWS's Snow devices, which are physical hardware appliances, to transfer data to its data centres: companies place the equipment in their own data centre, load their data onto it, ship it to AWS, and then load further equipment as needed [12].
Microsoft provides a comprehensive service called Azure Migrate, a migration service that includes its Data Box hardware for offline data transfer by clients moving workloads to Azure. It was initially launched as a VMware vSphere-specific migration tool but gradually evolved into the migration tool for every aspect of Azure, able to move any kind of workload to the Azure cloud. It supports no other cloud providers but, like other public cloud providers' tools, offers most migration services at no cost. These solutions are especially well suited for Windows and other Microsoft software [13].
Figure 2 shows the overview of Azure Migrate-Hub for datacentre migration.
- c.
Google Migration Services / Velostrata
Google acquired Velostrata, a start-up whose technology considerably improves its cloud migration capabilities and can increase migration speed up to ten-fold. Many of Google's migration services are free, just like those of other public cloud providers, but they exclusively support the Google Cloud Platform [13]. As shown in Figure 3, a VPN or Cloud Interconnect link between the on-premises data centre and GCP, as well as a Virtual Private Cloud (VPC) in GCP, are also needed. The Velostrata Management Server monitors the migration process and can be deployed in the source data centre or at the recipient end, i.e., the Google Cloud Platform. Velostrata's cloud edge components are established in the VPC subnets.
Figure 3. Velostrata deployment overview [15].
2.3. Cloud Migration test, implementation, and validation
- a.
Migration testing
With a clear migration solution and migration tool chosen, the business environment and migration tool must be deployed in the cloud platform environment for migration testing, implementation, and verification. This involves testing the service migration server and the migration tool; completing service data synchronisation, consistency checks, and data switchover; and performing acceptance: evaluating the system after migration to judge whether the migration goal has been achieved.
According to the migration scheme, the plan is tested in an appropriate testing environment, following correctness, consistency, and availability requirements in both technology and process; the migration scheme is verified, refined, and improved to form the final scheme. The test should simulate the customer's real situation as closely as possible, exercise the feasibility of the solutions and tools in various scenarios, and output test reports. After review, the results are communicated with customers, including renewed discussion of downtime and risk points in the migration process, especially for backward-compatibility verification and rollback testing: what changes need to be tested, how to change gradually, and which scripts must be executed during migration. In this way, test cases and scenarios cover the entire migration process, and the team ultimately works with users to refine and output migration test reports describing what must be modified in the migration solution [16].
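The data consistency check mentioned above can be sketched by fingerprinting each record on both sides and diffing the results. The record format and function names are illustrative; a real migration would hash serialised rows, files, or objects in the same spirit.

```python
import hashlib

def fingerprint(records):
    """Map each record key to a SHA-256 digest of its serialised value."""
    return {k: hashlib.sha256(repr(v).encode()).hexdigest()
            for k, v in records.items()}

def consistency_report(source, target):
    """Compare source and target datasets; report missing and mismatched keys."""
    src, dst = fingerprint(source), fingerprint(target)
    return {
        "missing_on_target": sorted(set(src) - set(dst)),
        "mismatched": sorted(k for k in src.keys() & dst.keys()
                             if src[k] != dst[k]),
    }
```

An empty report after the final differential sync is one concrete acceptance criterion before signing off on the switchover.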
- b.
Migration implementation and verification
This phase includes service migration, data synchronisation, and consistency checking. An idle period within the available service cycle is selected for traffic switchover. Before the switch, the system data must be updated in real time, and the time window of data differences between the source and target systems must be within an acceptable range. The business of the source system is then suspended, the target network is switched over with the differential data, and business personnel test and verify the functionality and data consistency of the business system, optimise the results, monitor problems in the process, and sign the acceptance report, so that the target system meets the requirements of system migration and deployment. In particular, when migrating a Linux system, many factors can affect the implementation of the migration and the reliability of the verification: changes in network hardware devices, virtual devices not supported by the kernel that need to be reconfigured, changes in the network environment, and changes in the CPU extension instruction set [17].
3. Cloud Computing Deployment and Service Models
3.1. Cloud Computing Service Models
The main service models of cloud computing are Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS). As shown in Figure 4, IaaS is the foundation of all cloud services, providing computing, networking, storage, and database facilities through the Internet. All IaaS services are based on virtualization: users acquire virtual machines from the service provider instead of provisioning physical hardware. A virtual machine (VM) mimics the behavior of an actual machine by utilizing its own virtual resources such as operating system, storage, RAM, and CPU [18].
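The virtualization idea above can be made concrete with a toy model: a physical server's capacity is carved into VMs, each receiving its own slice of CPU, RAM, and disk. The class, capacities, and method names are illustrative, not any specific hypervisor's API.

```python
class PhysicalServer:
    """Toy host whose capacity is carved into virtual machines."""
    def __init__(self, cpus, ram_gb, disk_gb):
        self.free = {"cpus": cpus, "ram_gb": ram_gb, "disk_gb": disk_gb}
        self.vms = []

    def provision_vm(self, name, cpus, ram_gb, disk_gb):
        """Allocate a VM if capacity allows; return its name, else None."""
        request = {"cpus": cpus, "ram_gb": ram_gb, "disk_gb": disk_gb}
        if any(self.free[r] < request[r] for r in request):
            return None  # insufficient capacity on this host
        for r in request:
            self.free[r] -= request[r]
        self.vms.append(name)
        return name
```

In real IaaS the placement decision spans many hosts, but the principle is the same: the user asks for virtual resources and never touches the physical hardware.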
- a.
Software as a Service (SaaS)
SaaS offers cloud users complete software, including email, social network services, online calendars, and virtually any other program hosted in the cloud. Customers using the SaaS service model have no control over any infrastructure resources such as storage, CPU, or operating system [20].
- b.
Platform as a Service (PaaS)
The platform resources offered by clouds, including various utilities and libraries, can be used by application developers through PaaS. Customers using the PaaS service model fully control the deployed application but have no control over other resources such as the operating system or storage.
- c.
Infrastructure as a Service (IaaS)
The third and most basic cloud service model is IaaS, which gives cloud users access to virtual machine servers and the accompanying infrastructure. Customers can use the infrastructure for any purpose, such as establishing their own platforms or creating software that runs on virtual machines. Unlike the other two models, IaaS offers the highest level of control over resources such as the operating system, storage, and applications. Following are some of the benefits of IaaS [21].
- a.
Cost savings
A major benefit of using IaaS is cost savings. With IaaS, the company need not set up servers in every office it has: staff outside the main office that hosts the data centre can access resources such as VMs via the private cloud [22]. Additionally, the company need not provide devices for every employee, since it can use desktop virtualization and employees can work from these virtual desktops on any device with an internet connection.
- b.
Scalability
Scalability is another benefit of IaaS. IaaS allows the company to provide resources to its employees and other users based on their current needs, scaling the resources allocated to various departments up or down depending on their workloads at the time [22]. This also leads to a better return on investment (ROI), as it ensures that most resources are utilised instead of wasted.
- c.
Centralised management
Centralised management is a main characteristic of the IaaS service model. Since the company uses a private IaaS solution, it needs to manage only a single data centre. Instead of maintaining and upgrading many different data centres across its offices, it maintains and upgrades one, which significantly reduces data-centre management work. Additionally, since employees access the company's resources via virtual machines from their personal devices, updates need only be applied to the VMs rather than to many different physical devices, further reducing the management workload.
- d.
Business Continuity
Finally, IaaS contributes to business continuity. If one of the company's offices faces a power outage, employees can still continue their work from their personal devices at home. This minimises downtime and lessens the impact on business operations [23]. If every office ran its own servers, a power outage would disrupt business operations.
3.2. Cloud Computing Deployment Models
In this context, the cloud offers multiple deployment models: private clouds, public clouds, community clouds, and hybrid clouds, as shown in Figure 5 [24].
- a.
Public Cloud
In a public cloud, the provider owns, manages, and maintains the infrastructure, which is located on its premises. The cloud provider may be a corporation, a public entity, or a mix of these. Anyone can use the public cloud to obtain the resources they require.
- b.
Private Cloud
A private cloud is owned by a single entity. When using a public cloud, the user has no control over compliance, which can be a problem because businesses may need to comply with certain laws related to data privacy. Private clouds, on the other hand, are under the full ownership of the user organisation and can be deployed with the necessary compliance controls [25]. While cost is a big drawback of the private cloud (and an advantage of public clouds), it is a necessary trade-off for stronger security and control. A company that has seen a surge in business activity will likely be a bigger target for hackers seeking to steal data to sell at a high price, and a private cloud helps make the company more resilient to data breaches. Additionally, a company that is already popular and seeing a surge in business should be able to afford to implement a private cloud.
Following are some of the benefits of private cloud over other cloud deployment models.
- i.
Maintenance & System Logs
Another benefit of a private cloud solution is the level of control it offers. In a public cloud, users may have no control over aspects such as audit logs and maintenance. By deploying a private cloud model, businesses gain better control over their resources, allowing them to perform maintenance and system upgrades whenever they deem appropriate and to have full access to audit logs [26].
- ii.
Geographical access
Geographical access is another benefit of private clouds. Cloud service providers offering public cloud solutions have a fixed set of availability zones around the world, and while major providers like AWS and GCP cover most regions, coverage is not universal, so some locations may face unstable or high-latency connections to the nearest zone. With a private cloud, the company can ensure that its employees always have a stable connection, and it can tailor its compliance requirements to those of the location where it sets up its offices [25].
- iii.
Security & Privacy
Finally, a private cloud offers good security and privacy. In a private cloud, all data is stored on the user organisation's own systems, so the company has greater control over the data and its access rights. This ensures that only those within the organisation can access the data, which yields strong data privacy and in turn reduces the risk of data leaks and of unauthorised individuals accessing critical data [25]. In a public cloud, the data would be stored on the cloud service provider's servers, which would compromise data privacy.
- c.
Community Cloud
This model is intended for a certain class, category, or community of users with high security needs. Only members of that community have access to its resources.
- d.
Hybrid Cloud
A hybrid cloud combines two or more deployment models, but each remains an independent entity, linked by standard or proprietary technology that allows data and application portability. The hybrid approach enables a company to deploy less sensitive software and data to the public cloud, making use of the scalability and cost-effectiveness of the public cloud.
4. SDN-enabled cloud computing
Cloud service providers maintain several data centres spanning many geolocations, with virtualized resource allocation to give dynamic service to their clients. In large data centres, thousands of switches interconnect tens of thousands of servers. Using virtualization technology, each server handles numerous application requests from various users, as capacity allows, by allocating different virtual machines (VMs) [27]. In a traditional network, multiple switches and routers connect these VMs to the Internet, and VM traffic is transferred via the data centre network (DCN). Because the DCN connects tens of thousands of servers, it is complicated and challenging to operate and scale with classical networking, where each switch runs its own control mechanism based on neighbouring-switch information. Hence, in a cloud data centre with a large server population and many dynamically changing VMs, the traditional network approach is not suitable. To address these drawbacks, cloud data centres began implementing software-defined networking (SDN) in their DCNs. SDN controls the network centrally, which makes it agile and flexible [28]. The SDN controller efficiently adjusts network flows to fit dynamic cloud services in terms of automation, virtualization, and security. With centrally acquired information, the controller can make quicker decisions and push them to data plane switches smoothly and dynamically. For network security, the controller can easily program control logic and transmit SDN control packets to switches to block malicious traffic, so networking behaviour is highly configurable [29]. Major cloud service providers such as Google and AWS deploy SDN in their data centres for better network manageability and scalability [30].
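The centralised control just described can be illustrated with a minimal sketch: the controller holds the flow table, and switches consult it to decide whether to forward or drop traffic. The rule format is a deliberately simplified stand-in for OpenFlow-style match/action rules, not a real controller API.

```python
class Controller:
    """Control plane: holds the flow table for the whole network."""
    def __init__(self):
        self.flow_table = {}  # match on source IP -> action

    def install_rule(self, src_ip, action):
        self.flow_table[src_ip] = action

    def block_malicious(self, src_ip):
        # security policy is pushed once, from a single central point
        self.install_rule(src_ip, "drop")

class Switch:
    """Data plane: forwards packets according to centrally installed rules."""
    def __init__(self, controller):
        self.controller = controller

    def handle_packet(self, src_ip):
        return self.controller.flow_table.get(src_ip, "forward")
```

Because every switch reads the same table, blocking a malicious source takes effect network-wide with one rule, instead of reconfiguring each switch individually as in a classical DCN.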
Cloud computing plays a vital role in business growth in various sectors, including health, IoT, wireless sensor networks, AI, sports, etc. [31,32,33,34,35,36]. The current era of applications is shifting almost entirely toward cloud-based solutions for better management and access [37,38,39,40,41,42,43,44,45], extending from smart cities to smart home applications [46,47,48,49,50,51,52]. Small startups and companies can easily launch their services with the help of cloud computing solutions at a lower cost [53,54,55,56,57,58,59]. The services are easy to manage and access and are provided 24/7 [60,61,62,63,64,65,66]. Initially, cloud solutions were introduced mainly for storage purposes, but they are now expanding far beyond that. However, migrating from on-premises to the cloud remains a huge challenge that still requires careful attention.