
February 8, 2022

Avoid the Hidden Challenges of Data Migration

Derek Swanson


To grow in today’s digital world, many companies are adopting a cloud computing strategy. Driven by core objectives like rapidly expanding customer reach and improving IT spend efficiency, pressure from the C-suite to adopt cloud services is also undoubtedly catalyzed by the reported $1 trillion in business value it promises to unlock. Given the current state and pace of cloud adoption (smaller, non-critical workloads are moving faster than larger, mission-critical ones), it’s no surprise that migrating enterprise-class workloads to the cloud remained a top priority for companies in 2021 for the fifth year in a row. In addition, experts forecast that cloud infrastructure spending will surpass $200 billion this year.

With directives from the top, many organizations are adopting a “cloud-first” strategy: all new platforms and systems must first be considered for deployment on a cloud architecture or service, and a (sometimes overly) aggressive schedule is put in place to migrate all systems to the cloud.

While a cloud-first strategy sounds appealing, its execution is difficult. When you dive deep into architecture requirements, not all workloads are suited for a simple move to the public cloud. One example is a legacy application that lives on an operating system with no adequate cloud alternative. Applications that need fast performance (high throughput and low latency) also have unique requirements that cloud-native architectures and solutions have difficulty addressing efficiently, or at all. The back end of these applications includes mission-critical databases where sporadic performance or failure may prove catastrophic to the user experience, impacting the flow of services or causing outages that disrupt the business. As a reminder, slow is the new down, so being slow is just as unacceptable as being offline.

The Grass Isn’t Always Greener

Migration struggles are not uncommon: the Cloud Security Alliance finds that 90% of organizations suffered from failed or disrupted data migration projects, primarily due to the complexity of moving from on-premises architectures to cloud-native environments. Even when companies are successful in migrating their databases and applications to the cloud, they often find that the cloud grass is not greener.

Moving data to the cloud brings its own set of risks (posteriori/Shutterstock)

Because cloud environments are underpinned by shared, virtualized hardware, hyperscaler providers put caps on the speed and flow of data to ensure there is enough to go around for everyone. Because of these quality-of-service design limits, which are built to accommodate the masses and the most common workloads, big mission-critical workloads running Oracle or Microsoft SQL Server may not be able to achieve the level of performance they need to be useful in the cloud. Even customers who shell out more money for higher performance tiers only get a bit more speed before they max out.
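As a rough illustration of how these caps bite, the following is a minimal back-of-envelope sketch in Python. The VM limits and workload figures are entirely hypothetical placeholders, not published numbers from any provider; the point is simply to compare a workload's measured peak demand against an instance's documented throughput and IOPS ceilings before committing to a migration.

```python
# Back-of-envelope check: does a database workload fit under a cloud VM's
# published QoS caps? All figures below are hypothetical placeholders --
# substitute the limits and workload profile for your own environment.

# Hypothetical per-VM ceilings (illustrative only)
vm_limits = {
    "max_throughput_mbps": 1200,   # cap on aggregate disk throughput (MB/s)
    "max_iops": 80_000,            # cap on aggregate disk IOPS
}

# Hypothetical workload profile, as measured on-prem
workload = {
    "peak_throughput_mbps": 2500,  # MB/s observed during the peak batch window
    "peak_iops": 150_000,          # IOPS observed at peak OLTP load
}

def exceeded_caps(workload, limits):
    """Return the dimensions where the workload exceeds the VM's caps."""
    violations = []
    if workload["peak_throughput_mbps"] > limits["max_throughput_mbps"]:
        violations.append("throughput")
    if workload["peak_iops"] > limits["max_iops"]:
        violations.append("IOPS")
    return violations

violations = exceeded_caps(workload, vm_limits)
if violations:
    print("Workload exceeds VM caps on: " + ", ".join(violations))
else:
    print("Workload fits within the published VM caps")
```

With the illustrative numbers above, the check flags both throughput and IOPS, which is exactly the situation where paying for a larger instance or a higher storage tier buys only incremental headroom.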

The cloud certainly provides many benefits compared to legacy architectures, but if your most important workloads are now sluggish compared to how they ran on-prem, those benefits are negated. That is why some companies have resorted to repatriation: moving workloads off the public cloud, either to a private cloud or back to on-premises datacenters.

According to 451 Research, 6% of organizations currently using IaaS/PaaS public cloud services have repatriated workloads to on-prem or enterprise co-location/datacenter environments, with 14% planning to do so within the next year. These numbers might not seem significant as a total percentage, but considering the class of workload being repatriated (the most important business-critical applications) and the significant time, effort, cost, and risk put into the initial cloud migration, it’s easy to understand the deep frustration of companies that decide to repatriate and migrate their workloads again. It’s a bad time for everyone involved, including the cloud providers, who may have made promises they could not deliver on.

Avoid Repatriation and Make the Most Out of the Cloud

The cloud is here to stay and is a transformative key to business innovation. Whether you have already begun your cloud journey or have a corporate mandate to move all mission-critical workloads to the cloud, finding a way to make the public cloud work for you is essential.

By introducing a high-performance data platform into your design, you can significantly de-risk migration by enabling robust, easy data mobility while also supporting the big workloads that need fast performance, consistent stability, and guaranteed availability. A data platform sits between the customer’s workloads and the underlying cloud infrastructure; its purpose is to deliver performance many times faster, and availability higher, than cloud-native infrastructure alone. For customers hosting their largest and most complex workloads on the cloud, the value is that their cloud provider plus a data platform can now handle these difficult workloads with ease.

For companies looking to grow in today’s landscape, the question of cloud migration is no longer if, but when and how, and leaving mission-critical applications behind is not a viable option. Ultimately, migrating mission-critical data and applications to the cloud requires detailed planning, and success entails an air-tight strategy and the right environment, regardless of the workload.

About the author: Derek Swanson is the CTO of Silk (formerly Kaminario), where he guides the customer-facing architect teams, helps develop the product roadmap, and serves as the primary technical evangelist in the organization. Derek has more than 25 years of experience as a technology evangelist, systems architect, and data systems engineer. Prior to Silk, Derek had a successful career architecting, deploying, and operating enterprise-class network, compute, and storage solutions in dozens of datacenters. Derek holds a Bachelor’s in Political Science and Government from Brigham Young University, with an emphasis on classical philosophy.

Related Items:

2021 Predictions from the Edge and IoT

Cloud Getting Expensive? That’s By Design, But Don’t Blame the Clouds

Cloud Storage: A Brave New World
