GDPR Declaration !!

I am not collecting any personal information of any reader of or visitor to this blog. I am using Blogger, provided by Google to host this blog. I understand that Google is using cookies to collect personal information for its Analytics and Adsense applications.

Monday, July 16, 2007

Data Migration / Conversion Strategies in ERP


Read my White Paper in IT Tool Box 

This article was written back in 2007, when on-premise applications were all the rage. The points mentioned in it are universal. The article is even more relevant today with the advent of new technologies like AI, ML and IoT, which consume data from the ERP. Hence the migration of accurate data is very important in the context of the new technology paradigm.

Introduction:

Effective and efficient migration of data is one of the cornerstones of the success or failure of an ERP implementation. Since a number of questions need to be answered and decisions need to be taken before loading the data into the ERP, a lot of planning is required before the data is migrated.

However, in practice this does not happen. Since data migration is the last activity before the go-live of an ERP implementation, this very important activity is subject to acute time constraints. With top management demanding a timely rollout of the ERP, planning for data migration is not given its due importance. The result is junk data going into the new ERP system, making the whole implementation suboptimal from day one.

The upshot is clogged database tables, wasted disk space, performance issues and confused users who are entrusted with cleaning up junk data in the new system (without much familiarity with the product and the processes), leading to the GIGO phenomenon! It is not long before the stakeholders lose faith in the ERP.

Given this context, the article aims to provide a high-level view of the data migration strategy to be followed in an ERP implementation. It illustrates the points using Oracle Applications as the ERP product, owing to the author's familiarity with the product.

Overall considerations

One, start early. Ideally, data migration planning should start immediately after the first pilot demo of the application (CRP 1). The consultant should start working on the data templates as a part of the solution design and continually update them as the design evolves. Since migration is a technical process (even though the functional consultant is the owner), it makes sense to involve the developer from the beginning of the process.

Two, decide how to handle partially completed data. For example, there may be open purchase orders against which material has been partially received, vendor invoices that have been partially paid, customer invoices that have been partially collected, or advances paid to a vendor against which material has been partially received. The organization will need to decide how to migrate such data into the ERP.
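To illustrate the point, what gets migrated for a partially paid invoice is its open balance. A minimal Python sketch, where the `Invoice` class and its fields are hypothetical illustrations rather than Oracle Applications objects:

```python
from dataclasses import dataclass, field

@dataclass
class Invoice:
    """Hypothetical legacy invoice with partial payments applied."""
    number: str
    amount: float
    payments: list = field(default_factory=list)

    def open_balance(self) -> float:
        # Only the unpaid portion is taken into the new ERP as an
        # opening balance; the settled portion stays in the legacy system.
        return self.amount - sum(self.payments)

inv = Invoice("INV-1001", 10000.0, payments=[4000.0])
print(inv.open_balance())  # 6000.0
```

The same calculation applies symmetrically to partially collected customer invoices and partially adjusted vendor advances.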

Three, consider taxation. Local taxation requirements can pose a challenge. For example, in India we have what is known as TDS (Tax Deducted at Source), which is applicable on vendor invoices as well as on vendor payments (detailed article below). The application will automatically create TDS invoices on the tax authority as soon as vendor invoices are generated. Since the tax authority is itself a vendor, these invoices will be migrated as part of the overall migration plan, so allowing the system to create TDS invoices would lead to duplication. The consultant might decide to switch on the TDS feature only after the data conversion is completed.

Data Migration Strategy

The data migration strategy consists of the following seven steps as shown in the diagram below:


The data to be migrated 

 

There are four groups of data that have to be migrated during an ERP implementation. They need to be entered in the given sequence owing to data dependency. The four groups are:

Configuration data (setup data): This group of data determines the general principles and processes to be followed by the ERP system. This is a one-time data entry and is entered manually.

Sub master data: These are the policies and rules followed in the organization, including, among others, payment terms, delivery terms and delivery methods. This is a one-time entry and is entered manually.

Master data: The master data includes the supplier master, customer master, chart of accounts master (for each segment), assets master, item master, banks master, currency master, region master, country master, location master, tax master etc. The principal feature of master data is its extensibility: it is subject to regular updates and maintenance. The mode of data entry will depend on the data volume; predominantly, tools are used to load the master data into the application.

Localization requirements form an important part of the master data. It has to be ensured that the templates used to load master data collect the localization information relating to each master entity.

Since master data is subject to regular updates, it is prudent to migrate it very close to the go-live date.

Transaction data: Completion of transaction data entry is normally called 'System Go-Live'. The transaction data normally consists of supplier balances, customer balances, the trial balance, asset balances etc. Since the entry of open balances has financial implications, this data entry is subject to a series of decisions: whether to load only open balances or both open and closed transactions, the archiving requirements, whether to resolve disputed transactions before or after migration, whether to post the migrated balances to GL, and finally the strategy for validating the data entered into the system. This is a one-time data entry and is normally a tool-based load.
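The dependency between the four groups can be made explicit in the load tooling. A minimal sketch, where the group names and the guard function are illustrative assumptions rather than part of any ERP API:

```python
# Load sequence dictated by data dependency:
# configuration -> sub masters -> masters -> transactions.
LOAD_ORDER = ["configuration", "sub_master", "master", "transaction"]

def check_sequence(loaded, next_group):
    """Refuse to load a group before all its prerequisites are loaded."""
    prerequisites = LOAD_ORDER[:LOAD_ORDER.index(next_group)]
    missing = [g for g in prerequisites if g not in loaded]
    if missing:
        raise RuntimeError(f"Load {missing} before {next_group}")

check_sequence(["configuration", "sub_master"], "master")  # passes silently
```

Attempting to load transactions before the masters are in would raise an error, which is exactly the discipline the sequence above demands.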

Timing of data load


As far as the timing of the data load is concerned, each of the above groups follows a different timeline. Normally the setup data is completed and signed off before the conference room pilots are completed, so that the UAT is conducted in an environment similar to the production environment. Since some of the setup data cannot be modified once entered (for example, the accounting calendar), a lot of care needs to be taken before entering the setup data in the system. It is very good practice to maintain version-controlled documentation for the setup data.

Since the sub masters are also a one-time data entry, they follow the same process discussed for the setup data above. However, since the sub masters are OU specific, a separate data load process needs to be used for each operating unit (OU).

As discussed in the section on data, master data is subject to continuous updates. Hence the strategy for master data consists of incremental loads as the process moves from CRP 1 through UAT. The primary objective of the incremental data entry is to test the completeness of the templates: every load of master data should be used to test the tools and the templates in addition to the data itself.

The transaction data can follow the same load process as the master data. Since transaction data has accounting implications, it is suggested that every load of transaction data test the whole data load cycle, including transferring to GL and validating the accounting entries.

Templates to be used



Templates are sharable components that can be used for migrating the data of different countries into the regional instance. A template should be designed to handle three sets of data.

The first set of data is mandatory for the database. These are the primary key values, such as supplier ID and customer ID, and must be unique.

The second set of data is required for the effective performance of the application in the organization's context. This is organization-specific data and could include values such as payment terms and delivery terms. These are not mandatory for the application to function.

The third set is the country-specific data. Since this data can differ across countries, the template needs to be suitably modified for each country.

As was the case with master data, it is best to freeze the template in the initial phase of the project and test its utility at different phases of the project. This will help in identifying missing data requirements long before the final go-live date.
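A template check along these lines can be automated. In this hedged Python sketch the field names (`supplier_id`, `tds_section` and so on) are invented for illustration:

```python
# Three sets of template fields, as described above.
MANDATORY = {"supplier_id", "supplier_name"}             # database keys
ORG_SPECIFIC = {"payment_terms", "delivery_terms"}       # org context, optional
COUNTRY_SPECIFIC = {"IN": {"tds_section"}, "US": set()}  # localization

def missing_fields(row, country):
    """Return the required fields absent from one template row."""
    required = MANDATORY | COUNTRY_SPECIFIC.get(country, set())
    return sorted(required - row.keys())

row = {"supplier_id": "S001", "supplier_name": "Acme",
       "payment_terms": "NET30"}
print(missing_fields(row, "IN"))  # ['tds_section']
```

Running such a check on every incremental load is one concrete way to "test the completeness of the templates" as the loads progress from CRP 1 to UAT.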

Tools to be used


Four different tools can be used to load data into the Oracle database, and different tools can be used for different types of data.

Desktop Integration: Data Loader is a tool that can be used to load data into the Oracle tables. It is best used where only a single window would be used to enter the data manually. The tool is easy to use and very flexible. Some of the data that can be loaded with it includes the chart of accounts and some of the sub masters. Its advantage is that the consultant does not need technical expertise to load data into the system.

SQL*Loader: This tool picks up data from a data file in a specific format ('.csv') and loads it into a staging table. From there the data can be moved to the base application tables using SQL scripts.

Application Standard APIs: These are program interfaces provided by the application vendor to load standard data into the base application tables.

Custom-Built Interfaces: These need to be developed for non-standard data that needs to be loaded into the database.

Custom-Built Forms: The problem with all the methods described above is the requirement for technical knowledge. It is a good idea to spend effort developing custom-built forms to load data into the database, similar to the 'Import Manager' tool available in some applications. The effort will be well worth it, since these forms can act as components reusable across implementations. The development of these forms is a separate project and is not covered within the scope of this document.

Each of these tools needs to be evaluated from the perspective of speed, scope of errors and data volume.
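The staging-table pattern behind SQL*Loader and the custom interfaces can be sketched as follows. This uses Python with sqlite3 purely as a stand-in for the Oracle staging schema; the table and column names are assumptions for illustration:

```python
import csv
import io
import sqlite3

# Stand-in staging table (in Oracle this would be a staging table
# populated by SQL*Loader from a control file).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE supplier_stg (supplier_id TEXT, name TEXT)")

csv_file = io.StringIO("supplier_id,name\nS001,Acme\n,NoKey\nS002,Globex\n")
rows = [(r["supplier_id"], r["name"])
        for r in csv.DictReader(csv_file)
        if r["supplier_id"]]            # reject rows missing the key
conn.executemany("INSERT INTO supplier_stg VALUES (?, ?)", rows)

# From here, SQL scripts would move validated rows to the base tables.
print(conn.execute("SELECT COUNT(*) FROM supplier_stg").fetchone()[0])  # 2
```

Note how the row with the missing key is rejected at the staging step; catching bad rows before they reach the base application tables is the whole point of staging.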

Migration related setups




Based on the data migration strategy, a few setups need to be created specifically for data migration. These setups can be end-dated after migration is completed.

Creation of new account codes: A few opening balance intake accounts need to be set up. These accounts get one side of the balance from the sub modules and the balancing value from the GL trial balance. They include the Payables Open Intake account, Receivables Open Intake account, Assets Open Intake account etc. It should be noted that the creation of these account codes depends on the migration strategy (specifically, whether the transactions taken into the system are going to be posted to GL or not).

Module-related setups: To ensure that the opening balance transactions are tracked and closed separately from regular transactions, it is better to create separate identifiers such as transaction types and transaction sources in the different applications. For example, in the AutoAccounting setup in Oracle AR, you may need to create separate transaction types or standard memo lines for taking in receivables opening balances.
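The role of the intake accounts can be shown with a worked example: each migrated subledger balance is offset against its intake account so that the resulting journal nets to zero. The account names and amounts below are purely illustrative, not seeded Oracle accounts:

```python
# One side comes from the sub module (payables open invoices); the
# balancing side comes from the opening balance intake account.
journal = [
    ("AP Liability",          -50000.0),
    ("Payables Open Intake",   50000.0),
]

def is_balanced(entries, tolerance=0.005):
    """A journal is postable only if debits and credits net to zero."""
    return abs(sum(amount for _, amount in entries)) < tolerance

print(is_balanced(journal))  # True
```

Running a balance check like this over every opening balance journal before posting is a cheap safeguard against one-sided intake entries.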

Post Migration Verification and signoff


This is the most important step in data migration, since it transfers ownership of the data to the customer. Each element of data migrated into the application should be documented by the consultant and accepted and signed off by the customer. The aspects to consider include, for each element, the verification criteria and the supporting documents. It is better to create separate folders to hold the migration data. The 'Input' folder is owned by the customer, who places the validated data in it. The consultant picks up this data, loads it into the system, and places the uploaded data and the supporting documents in the 'Output' folder. Ideally, the supporting documents should be standard application reports: for example, the supporting document for the supplier master could be the Supplier Listing report, for item opening balances the Inventory Valuation reports, and so on.

It is important to have the individual super users sign off on each data element and the customer's project owner sign off on the completion of the data migration phase.
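A reconciliation check of this kind can produce the evidence that backs the sign-off. A minimal sketch, where the key name is an assumption for illustration:

```python
def reconcile(input_rows, loaded_rows, key="supplier_id"):
    """Compare the customer's 'Input' data with what was loaded,
    reporting records dropped or unexpectedly added during the load."""
    in_keys = {r[key] for r in input_rows}
    out_keys = {r[key] for r in loaded_rows}
    return {"missing": sorted(in_keys - out_keys),
            "unexpected": sorted(out_keys - in_keys)}

input_rows = [{"supplier_id": "S001"}, {"supplier_id": "S002"}]
loaded_rows = [{"supplier_id": "S001"}]
print(reconcile(input_rows, loaded_rows))
# {'missing': ['S002'], 'unexpected': []}
```

An empty report for every data element, attached alongside the standard application reports, makes the sign-off an objective exercise rather than a leap of faith.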

Data Archiving


The decision on data archiving depends on various factors, including the statutory requirements of various countries, reporting requirements and transactional information requirements. For statutory purposes, some countries require up to 7 years of data to be maintained. The organization may also want to refer to older data for internal auditing and comparison purposes, and finally users may want to refer back to more recent transactions.

Based on each of the above, the archiving period will vary. While statutory reporting may require 7 years of data, comparison reporting may call for 4 to 5 years, and transactional requirements will call for the last one year's data.

For data archiving, the organization can follow any of the following strategies. The issue of data archiving applies to the intake of transactional data discussed earlier.

1. Do not load any past and closed data into the applications; import only opening balances: For AR and AP, import only the current balances in the respective control accounts. For any partially completed transactions, import only the open balances, and close out as many transactions as possible (by paying off AP invoices, for example). By keeping the data compact, this method is easy to apply and intuitive. Its major advantage is that the organization can carry out a data clean-up exercise before loading the data into Oracle. However, the approach has the following drawbacks.

a. Historical transactions will not be available in the ERP for statutory requirements: Some countries require 7 years of archived data, and this information will not be available in Oracle if the above method is followed. The way to handle this is either to keep an archive instance with the historic data, or to back up the historic data in an archive database and have some of the key reports built in Discoverer pointing to that database.

b. Transactional linkages will not be available: Some current transactions may require the user to track back to older transactions. With the above approach, the past transactional information will not be available.

2. Load only the past data corresponding to the current transactions: In this method, load all the transactional data relating to the current open balances. For instance, if you have a partially paid invoice, enter the original invoice and the partial payments into Oracle. Most of the time, this takes care of the user requirements discussed above.

3. Load all the data from the legacy system into Oracle: There is no data cleaning in this method; it is more like a data dump from the legacy system into Oracle. In addition to increasing disk requirements, it leads to duplication of data as well as performance issues.

For options 1 and 2, the organization will have to consider a strategy for the archive instance as part of the detailed instance strategy. This strategy should cover the infrastructure as well as the MIS reporting linkages, in addition to the archived data itself.

Final Outcome

The final output of the data migration strategy discussions is a table as given below. This is a sample list.


Conclusion

Configuration and data migration are the two most important activities determining the success of an ERP implementation. Accurately migrated data demonstrates to the stakeholders that the ERP system is a reflection of their current organization. This increases their confidence in the application and is in turn a key factor in the ultimate success of the implementation. Yet in many instances this important activity does not get the serious attention it deserves.

By following a well laid out data migration strategy, organizations can ensure an effective and efficient data migration.

Comments:

Unknown said...

Great and informative article. Any advice on best practice around migration of historic/closed transactions part way through a financial year would be very helpful. We are migrating from 11.0.3 to 11.5.10. Thanks in advance Priti

V K Ramaswamy said...

I have added one more section on the 'Data Archiving' strategy, which attempts to partially answer your question. For a more detailed explanation, please wait for my article on 'Opening Balance Intake in Oracle'. I will try to update the same soon.
In the meantime, please go through the other articles in this blog and let me know your views. Also, please inform your friends about this blog. It can grow only through user participation.

Saravanan (Saro) said...

Hi Ram,

Thanks for giving such an golden tips in Oracle Apps.

Basically, I am Oracle Apps Distribution Functional consultant. I would like to know how OTC and P2P affect accounting. I want that full flow with screeshot.

Even, I have gone through you tips in the blog.

If you explain me with screenshot, it will be more help full for us.

My email id is saroapps@hotmail.com

Regards,

R. Saravanan

