Maximizing Your Data Migration Investment
Many organizations are migrating their data these days, but they aren't always doing so efficiently, which means they're effectively leaving money on the table. Two key areas of consideration are data relevance and design. Read on to learn how the right approach to both can ensure you're getting the most out of your data migration investment, now and in the future.
When undertaking a data migration, one of the biggest sources of inefficiency is how you prepare your data beforehand. Data cleansing is key: you need to evaluate the data you have before you move it, and that evaluation must be the first step in the process.
Evaluate for Relevance
The concept of data relevance is central to this process. Data relevance is an aspect of data quality and a measure of the amount of useful insight a data set can provide for a specific business objective. When you start to apply relevance, you often find you aren’t using a lot of the data you have.
For instance, if you have a million different materials and you're preparing to do a migration, you might think you have to clean and load all million of them. But if you apply relevance, you can see which of those materials you have actually used in the last two years; it's probably a small subset. If you haven't sold, inventoried, or purchased a material in that window, you should probably leave its records out of your new target system. In other words, you don't need to carry all of your dirty laundry with you, especially if, say, a certain SKU has been out of production for years.
You can greatly reduce the effort by leaving behind old data that's no longer relevant, or by moving it into a data warehouse if it's still needed for support or historical reasons. This concept is paramount, but you also don't want to leave too much data behind. Sometimes you have to apply fairly complex relevance logic to figure out what is actually relevant. Is the material Active in the system? Is it a component of a bill of materials (BOM)? Do you have open service contracts, sales orders, or purchase orders for the item? Is it a new material that was just created and therefore doesn't yet have any transactions or inventory?
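To make this concrete, here is a minimal sketch of how relevance rules like these might be expressed in code. The record structure and field names (last_transaction_date, is_bom_component, and so on) are hypothetical; the actual checks, tables, and thresholds will depend on your source system and business objectives.

```python
from datetime import date, timedelta

RELEVANCE_WINDOW = timedelta(days=2 * 365)  # two-year rule of thumb

def is_relevant(material: dict, today: date | None = None) -> bool:
    """Return True if a material record should be carried into the target system."""
    today = today or date.today()
    cutoff = today - RELEVANCE_WINDOW

    # Recently used: sold, inventoried, or purchased within the window.
    last_used = material.get("last_transaction_date")
    if last_used and last_used >= cutoff:
        return True

    # Structural or contractual ties keep a record relevant even if it looks idle.
    if material.get("is_active") and (
        material.get("is_bom_component")
        or material.get("open_sales_orders", 0) > 0
        or material.get("open_purchase_orders", 0) > 0
        or material.get("open_service_contracts", 0) > 0
    ):
        return True

    # Newly created materials have no history yet but still need to move.
    created = material.get("created_date")
    if created and created >= cutoff and not material.get("has_transactions", False):
        return True

    return False

# Migrate the relevant subset; archive the rest to a data warehouse if still needed.
materials: list[dict] = []  # in practice, extracted from the legacy system
to_migrate = [m for m in materials if is_relevant(m)]
```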
However, if you get this wrong, you end up spending a lot of time fixing data you might not even need. A two-year relevance window is a good rule of thumb because, if the project takes a year, the window will have shifted by a year by the time you reach go-live, and half of that material may already have aged out of relevance.
You need to look at relevance early on so that you don’t spend your time fixing data you don’t need. And don’t fix transactional data, because you’re going to ship that purchase order long before you go live.
Getting Design Right: Enabling Flexibility Over Rigidity
The second common inefficiency when it comes to getting the most from your data migration has to do with design. Many data migration processes don't allow for changes in design, yet design is fluid. Companies often try to execute their data migration against a design they locked in on Day 1, but that design is almost certainly going to change, because these are huge, complex projects.
Think about how many revisions a writer must make to even a short article. Now imagine documenting all of your business processes and technical requirements before you start a data migration and then not being able to make any adjustments. You need to tightly couple the migration technical specifications to the target system's functional design so that as the functional design changes, those changes flow directly into the migration project, where they can be handled gracefully. Changes in the design must be expected and absorbed by the ETL process with minimal effort. On a very large, complex ERP migration, it can be common for every field mapping to change about three times on average.
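One way to absorb that churn is to treat field mappings as data rather than code. The sketch below is a simplified illustration of that idea, assuming a hypothetical mappings.csv maintained alongside the functional design; real migration tooling handles this far more robustly, but the principle is the same: when a mapping changes, the spec changes, not the pipeline code.

```python
import csv

# Transformations the mapping spec can reference; extend as the design evolves.
TRANSFORMS = {
    "none": lambda v: v,
    "upper": lambda v: str(v).upper() if v is not None else v,
    "strip": lambda v: str(v).strip() if v is not None else v,
}

def load_mappings(path: str) -> list[dict]:
    """Read source_field, target_field, transform rows from the mapping spec."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def map_record(source: dict, mappings: list[dict]) -> dict:
    """Apply the current mapping spec to one source record."""
    target = {}
    for rule in mappings:
        value = source.get(rule["source_field"])
        transform = TRANSFORMS.get(rule.get("transform") or "none", TRANSFORMS["none"])
        target[rule["target_field"]] = transform(value)
    return target
```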
For example, a large global company decided that every business unit around the world would follow the exact same accounting practices. They had set up their financials and declared the design “complete.” When they began mapping their international business units into the new system, the design team was surprised to learn that many countries have very different, but very strict, accounting laws. They had to scramble to re-open the design and revisit many of the “global” decisions which had been made. That same conversation surely happened as they began rolling out to all the different countries where they operated.
But the global design should focus on the relevant data. How many payment terms do you need to configure as part of the new design? How many bins do you need in your storage locations? Those answers may be significantly different once you have eliminated obsolete materials, vendors, and customers from consideration. Clean, relevant data enables a much better, more efficient design.
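As a simple illustration, a design question like "how many payment terms do we actually need?" can be answered from the relevant subset rather than the full legacy data set. The sketch below uses hypothetical vendor records and field names; the same idea applies to bins, customers, or any other configuration driven by master data.

```python
from collections import Counter

def design_scope(records: list[dict], attribute: str, relevant_ids: set[str]) -> Counter:
    """Count distinct values of an attribute across relevant records only."""
    return Counter(
        r[attribute] for r in records if r["id"] in relevant_ids and r.get(attribute)
    )

# Hypothetical vendor records; only V1 and V2 survived the relevance filter.
vendors = [
    {"id": "V1", "payment_terms": "NET30"},
    {"id": "V2", "payment_terms": "NET60"},
    {"id": "V3", "payment_terms": "NET90"},  # obsolete vendor, left behind
]
print(design_scope(vendors, "payment_terms", {"V1", "V2"}))
# Counter({'NET30': 1, 'NET60': 1}) -> only two payment terms need configuring
```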
Getting It Right the First Time
The bottom line with any data migration is that you have to approach it with a data-first mindset and get the right stakeholders involved. Most data conversion is an IT function, especially at a smaller company. However, your IT team is unlikely to fully understand every aspect of the data, especially when it comes to things like relevance. That's where you need input from the people who actually use the data. Getting your data migration right lays a better foundation for future mergers and acquisitions and for stronger data quality and governance practices. Clean, business-ready data is also the critical foundation for data analytics and for feeding your AI and ML initiatives.
Setting Yourself Up for Success
Data is truly a company's primary asset. Taking a data-first approach to a migration project is critical to making your functional design, cleansing efforts, and ETL process efficient. Looking at your data first is the key to going live with 100% business-ready data with the least amount of effort. Data migration is hard no matter what, but it's even harder when it's not done right.
About the author: John Munkberg is senior vice president of product management at Syniti, the leader in enterprise data management. John has been working in data for over 20 years, including 10 years on the road as an SAP Data Migration consultant. He has focused on making processes more efficient, automating the repetitive, integrating systems, and reducing the effort needed to get work done. Having moved full time into product development, John now works on making Syniti Migrate the best data migration solution for the world's most challenging projects.
Related Items:
Avoid the Hidden Challenges of Data Migration
What IT Leaders Need To Know About Application Modernization
Cloud Migration Is a Boon to Operations. But It Creates a Perfect Storm of Data Challenges