Data mapping focuses on the movement of data between systems. Good data mapping leads to greater consistency, which in turn increases productivity, reduces ongoing maintenance costs, improves the readability of software, and makes it easier for developers to understand new code quickly. There are several methods to import your data into OpenERP, such as:
Through the web-service interface
Using CSV files through the client interface
Building a module containing .XML or .CSV data files
Loading directly into the SQL database using an ETL tool
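Of these methods, the CSV route is the simplest to sketch. A minimal example of preparing a CSV file for import through the client interface follows; the column headers must match the field names of the target model, and the model and field names used here (`res.partner` with `name` and `email`) are illustrative assumptions.

```python
import csv
import io

# Rows to import; the keys are assumed to be field names of the
# target model (here res.partner).
rows = [
    {"name": "Acme Corp", "email": "info@acme.example"},
    {"name": "Globex", "email": "sales@globex.example"},
]

# Build the CSV content in memory: a header line of field names,
# followed by one line per record.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "email"])
writer.writeheader()
writer.writerows(rows)
csv_content = buf.getvalue()
```

The resulting `csv_content` can be saved to a file and loaded through the client's import dialog.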
OpenETL (open source Extract, Transform, and Load) is a Python framework that implements ETL concepts for data import and export, and can also apply operations between import and export. ETL tools aim to make importing easier, more reusable, and less prone to errors. Data mapping encompasses the extract-transform-load (ETL) facilities used for bulk data movement; it also includes mechanisms that support the continuous movement of discrete records or rows between systems.
Data mapping involves matching a source to a target: for example, two databases that contain the same data elements but call them by different names. A simple case is moving the value of a 'customer name' field in one database to a 'customer last name' field in another. To do so, you define the mapping so that your ETL tool knows to take the value from the source field 'customer name' and write it to the target field 'customer last name'.
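The field mapping described above can be sketched as a simple dictionary from source field names to target field names; the field names here are the hypothetical ones from the example, not a real schema.

```python
# Hypothetical mapping of source field names to target field names.
FIELD_MAP = {"customer name": "customer last name"}

def apply_mapping(record, field_map):
    """Rename the keys of a source record according to the mapping;
    fields without a mapping entry keep their original name."""
    return {field_map.get(key, key): value for key, value in record.items()}

source = {"customer name": "Smith", "city": "Brussels"}
target = apply_mapping(source, FIELD_MAP)
```

After the call, `target` holds the same values under the target database's field names.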
Extract is the first part of the ETL process and involves extracting the data from the source systems. Each system may use a different data format; common formats are relational databases and flat files. The main goal of the extract phase is to convert the data into a single format appropriate for transformation processing.
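As a minimal sketch of the extract phase, the snippet below reads a flat file (CSV) into a single common in-memory format, a list of dictionaries, ready for the transformation phase. The column names are illustrative.

```python
import csv
import io

def extract_csv(text):
    """Extract rows from CSV text into a common format:
    a list of dictionaries keyed by column name."""
    return list(csv.DictReader(io.StringIO(text)))

# A flat-file source with hypothetical columns.
flat_file = "id,name\n1,Alice\n2,Bob\n"
records = extract_csv(flat_file)
```

A relational-database source would be extracted the same way, with a query producing the same list-of-dictionaries format so that the transformation code is independent of the source.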
In the transformation phase, a series of functions is applied to the extracted data to produce data for the end target. The data sources may hold data in different formats, and the transformation phase converts it into the required format. Some data sources require little or even no manipulation of the data; in other cases, the following transformations may be required to meet the business and technical needs of the target database:
Selecting only certain columns to load (or selecting null columns not to load)
Translating coded values (e.g., if the source system stores 1 for male and 2 for female, but the warehouse stores M for male and F for female)
Encoding free-form values (e.g., mapping "Male" to "1")
Joining data from multiple sources (e.g., lookup, merge) and deduplicating the data
Splitting a column into multiple columns
Applying further data cleansing rules
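Two of the transformations listed above, translating coded values and splitting a column into multiple columns, can be sketched as follows; the record layout and field names are hypothetical.

```python
# Translate coded values: the source stores 1/2, the target wants M/F.
GENDER_CODES = {"1": "M", "2": "F"}

def transform(record):
    """Apply example transformations to one extracted record."""
    out = dict(record)  # work on a copy of the record
    # Translate the coded value, falling back to "?" for unknown codes.
    out["gender"] = GENDER_CODES.get(out.pop("gender_code"), "?")
    # Split one column ("full_name") into two ("first_name", "last_name").
    first, _, last = out.pop("full_name").partition(" ")
    out["first_name"], out["last_name"] = first, last
    return out

row = {"gender_code": "1", "full_name": "John Doe"}
clean = transform(row)
```

In a real pipeline each such rule would be one function in the series applied to every extracted record before loading.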
The load phase loads the data into the end target, usually a data warehouse (DW). Depending on the requirements of the organization, this process varies widely; some data warehouses may overwrite existing information with cumulative information. Because the load phase interacts with a database, the constraints defined in the database schema, as well as triggers activated upon data load, apply (for example, uniqueness, referential integrity, mandatory fields), and these also contribute to the overall data quality performance of the ETL process.
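The interaction between the load phase and schema constraints can be sketched with an in-memory SQLite database; the table and column names are illustrative. Here a UNIQUE constraint on the target table rejects a duplicate row during load, exactly the kind of constraint-driven quality check described above.

```python
import sqlite3

# Target table with a uniqueness constraint that applies on load.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE partner (name TEXT NOT NULL, email TEXT UNIQUE)")

# Transformed rows ready to load; the second is a deliberate duplicate.
rows = [
    ("Acme Corp", "info@acme.example"),
    ("Acme Corp", "info@acme.example"),
]

loaded, rejected = 0, 0
for row in rows:
    try:
        conn.execute("INSERT INTO partner VALUES (?, ?)", row)
        loaded += 1
    except sqlite3.IntegrityError:
        # The UNIQUE constraint rejected the duplicate row.
        rejected += 1
conn.commit()
```

A production load would typically log or quarantine rejected rows rather than silently count them, but the constraint enforcement works the same way.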