Accelerating Your Move to the Cloud

The transition to the cloud has become a critical strategic move, but organizations must approach this migration with careful consideration.

Cloud computing and storage have both skyrocketed. According to an IDC study, approximately half of all data will be stored in the cloud by 2025. At the same time, the volume of data has exploded over the past decade, driven by the rapid growth of technologies such as artificial intelligence (AI), machine learning (ML), and the Internet of Things (IoT). This growth has affected the SAP arena as well, and organizations have been switching to hyperscalers to gain the capacity to handle it.

More and more companies are moving to the cloud. But it's not just a matter of "lift and shift": the migration must be done purposefully, and it requires preparation to get right. The right approach is to streamline your data and systems before the move, which results in faster migrations, lower overall costs, and reduced risk.

Three factors to consider

To achieve the goal of running on a hyperscaler, organizations must consider three factors:

  • Cost ramifications: The size of your database directly affects how much your hyperscaler service will cost. These expenses, which can be substantial when moving sizable applications such as SAP to the cloud, can erode the return on your investment.
  • Data optimization: Typically, only about 20% of the data in an enterprise application is regularly used. The historical remainder must still be retained for business and regulatory purposes, but migrating all of it adds unnecessary complexity, risk, and expense (see the worked example after this list).
  • Software requirements: Legacy systems frequently run outdated, unsupported software, while hyperscalers require a minimum supported software level. This creates a barrier to going completely cloud-based.
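
To make the cost point concrete, here is a minimal back-of-the-envelope sketch in Python. The 20% active-data share comes from the figures above; the database size, the compression ratio, and the per-GB rate are purely illustrative assumptions, since actual hyperscaler pricing varies by provider and service tier.

```python
# Hypothetical estimate of how archiving rarely used data changes the footprint
# (and therefore the storage bill) that has to move to a hyperscaler.
# All figures except the 20% active-data share are illustrative assumptions.

DB_SIZE_GB = 10_000          # assumed size of the production database
ACTIVE_SHARE = 0.20          # share of data that is regularly used (per the article)
ARCHIVE_COMPRESSION = 0.25   # assumed ratio: archived data shrinks to 25% of its size
PRICE_PER_GB_MONTH = 0.10    # assumed blended storage rate, USD per GB per month

active_gb = DB_SIZE_GB * ACTIVE_SHARE
archived_gb = DB_SIZE_GB * (1 - ACTIVE_SHARE) * ARCHIVE_COMPRESSION
footprint_after_gb = active_gb + archived_gb

print(f"Before archiving: {DB_SIZE_GB:,.0f} GB "
      f"(~${DB_SIZE_GB * PRICE_PER_GB_MONTH:,.0f}/month)")
print(f"After archiving:  {footprint_after_gb:,.0f} GB "
      f"(~${footprint_after_gb * PRICE_PER_GB_MONTH:,.0f}/month)")
```

Under these assumptions, the footprint to migrate drops from 10,000 GB to 4,000 GB, and the illustrative monthly storage cost falls proportionally. The point is the shape of the calculation, not the specific numbers.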

Drivers of change

The ideal strategy is to streamline your data and systems before the migration. As a result, migrations proceed more quickly, are less risky, and are less expensive overall. Many organizations – whether they have SAP systems or something else – have built up a set of legacy systems over the years. These systems were once in active use, continually accumulating data, but they have since been replaced by one or more newer systems. Organizations keep the old systems, however, because they still need access to the data.

In the past, doing something about legacy systems was never a priority, but times have changed. There are now real drivers to retain access to that data without keeping the legacy systems themselves in place. Decommissioning legacy systems is therefore about helping customers retain the data they need without having to keep those old systems running.

A massive migration to the cloud is driving and incentivizing this change, as part of the larger revolution in IT over the last few years. Cloud is the future in most cases. There are always detractors who say it is the wrong move, but many organizations are either already on some sort of hyperscaler platform or are planning the move.

The move to SAP S/4HANA within the SAP community is driving change as well. Why is this important? Because when a company lifts and shifts systems it has run for many years into a cloud environment, it is moving to Azure, AWS, or Google Cloud. From that point on, there is an almost daily or hourly ticking clock on the cost of that environment, and cloud migrations often end up costing customers two or three times what they expected.

Three best practices for accelerating cloud migration

First, it is crucial that organizations understand the need to involve the business. Typically, the migration is left in the hands of the IT team, but it touches multiple areas of the business, not just finance, sales, and procurement. Companies may need some level of engagement with people in each of the departments that work with these systems.

Second, project leaders need to make sure people understand the real drivers – that is, why they need to keep the data. Why are departments keeping these legacy systems? Why aren’t they just turning them off and throwing them away? It’s because they need access to the data. Which customers have purchased what products when, and what maintenance has been done? Have the customers paid their bills? There are a million reasons why they might need access to older records.

Third, audit readiness is vitally important. An auditor might show up and demand to see previous years' data, and if you can't produce it, you will be in trouble very quickly. So you need to know exactly what data you must keep and for how long.
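
As a rough illustration of what knowing exactly what data you must keep, and for how long, can look like in practice, here is a minimal sketch that encodes retention rules as data and checks a record against them. The record types and retention periods are hypothetical placeholders; real rules come from your legal, tax, and regulatory obligations.

```python
from datetime import date

# Hypothetical retention periods (in years) per record type. Real values come
# from legal and regulatory requirements, not from this sketch.
RETENTION_YEARS = {
    "financial_posting": 10,
    "sales_order": 7,
    "maintenance_log": 5,
}

def must_retain(record_type: str, created: date, today: date) -> bool:
    """Return True if the record is still inside its retention window."""
    years_kept = (today - created).days / 365.25
    # Unknown record types default to the longest window - the cautious choice.
    return years_kept < RETENTION_YEARS.get(record_type, max(RETENTION_YEARS.values()))

# A 2016 sales order checked in 2024 is past its 7-year window and could be purged;
# a 2016 financial posting is still inside its 10-year window and must be kept.
print(must_retain("sales_order", date(2016, 3, 1), date(2024, 6, 1)))        # False
print(must_retain("financial_posting", date(2016, 3, 1), date(2024, 6, 1)))  # True
```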

Accelerating the shift

To limit the expense of moving vast amounts of data to a new hyperscaler environment, companies should archive data from their production system before the migration. Archived data is compressed, significantly reducing the overall data footprint. Archiving should also not be a one-off exercise: once companies start archiving, they should keep archiving, which is equally crucial after the switch to a hyperscaler environment because expenses are directly correlated with database size.

Customers should decommission old systems as part of their hyperscaler migration to avoid the difficulties of transferring those systems to cloud environments. This ensures that their historical data is preserved in a safe, modern environment.

Secure, safe, streamlined data

The transition to the cloud is a critical strategic move, but it must be approached with careful consideration. Streamlining data and systems before the move not only makes migrations faster and smoother but also reduces costs and mitigates risks. Factors such as cost ramifications, data optimization, and software requirements must all be weighed to achieve the ultimate goal of benefiting from a hyperscaler.

The growing demand for cloud adoption and the move to S/4HANA among the SAP community are driving organizations to prioritize decommissioning legacy systems while retaining access to essential data. By archiving data and decommissioning old systems, companies can accelerate their shift to the cloud while preserving valuable historical data in a modern, secure environment.

About Robert Reuben

Robert Reuben is the managing director of Proceed Group. He has extensive experience in enterprise IT, with successful roles at SAP and IBM, most recently within the SAP Platform and Technologies team, including leadership and management positions with both European and Worldwide responsibility.
