Monday, May 13, 2019

Cloud ERP - Automate File Based Data Import using ErpIntegrationService

 

A lot of customers are past the point of adopting Oracle Cloud Applications and are moving to the next step of the lifecycle: integrating Oracle Cloud Applications within their enterprise architecture. Oracle currently provides a plethora of CRUD web services, both REST and SOAP, for integration.
But in some cases the only way to interface data is still through File Based Data Import (FBDI).

Using FBDI, you generate an input file for import from the Excel templates Oracle delivers. This input file needs to be uploaded to UCM (the content server). Next you start the Load Interface File for Import process to load the data into the staging tables, and finally you start the appropriate import process to load the data into the actual transaction tables. These are all manual steps.

With the ErpIntegrationService web service you can now automate the process of:
  • uploading the input file to UCM
  • initiating the Load Interface File for Import process
  • initiating the import process

Where to start the full automation?
In order to automate this process from any backend system or middleware, you first need to extract the data, mimic the format that the FBDI template generates, and in some cases zip the extract file. A minimal sketch of the extract step follows below.
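As a sketch only: the column values below are placeholders, since the real layout must match the FBDI template for the object you are importing.

# Sketch: write an FBDI-style csv extract. The columns and values below are
# placeholders; the real layout must match the FBDI template exactly
# (column order and formats).
import csv

rows = [
    ["NEW_SUPPLIER_1", "SUPPLIER"],  # hypothetical values for illustration
]

with open("PozSuppliersInt.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)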

Suppose we want to automate the import of Suppliers.
The FBDI template looks like this:


And generates the following csv file:


In order to upload the extract (input file) to Oracle Cloud using the ErpIntegrationService, you need to base64 encode the zip file that contains the csv input file.
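As a sketch (assuming the extract has already been written to PozSuppliersInt.csv next to the script), the zip and encode steps could look like this in Python:

# Sketch: zip the FBDI csv extract and base64 encode the archive so it can
# be placed in the Content element of the importBulkData request.
import base64
import zipfile

with zipfile.ZipFile("PozSuppliersInt.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("PozSuppliersInt.csv")

with open("PozSuppliersInt.zip", "rb") as f:
    content_b64 = base64.b64encode(f.read()).decode("ascii")

print(content_b64)  # paste into the Content element of the SOAP request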



Before you can invoke the importBulkData operation, you need to look up the import process path, job name, and parameters. These can be found via the task Manage Custom Enterprise Scheduler Jobs for [module]. It is also good practice to run the import process manually once, so you can see the actual parameter values it expects.



Request
Using the following request you can now initiate the upload, the Load Interface File for Import process, and the import process in one call:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:typ="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/types/"
                  xmlns:erp="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/">
   <soapenv:Header/>
   <soapenv:Body>
      <typ:importBulkData>
         <typ:document>
            <erp:Content>UEsDBBQAAAAIADGhcUlqdPKdYAAAAHkAAAATAAAAUG96U3VwcGxpZXJzSW50LmNzdnMOcnUMcdVxDQrwzCtJTS9KLMnMz9PLzC3ILypxKs3JdkksSdTR0VHQcc4vAoqBpXWCSwsKcjJTi3R0ggNc/VziHUNDPPyDPKNcXXRAav2AynUgWCESzIJAGABq4eUCAFBLAQIUABQAAAAIADGhcUlqdPKdYAAAAHkAAAATAAAAAAAAAAEAAAAAAAAAAABQb3pTdXBwbGllcnNJbnQuY3N2UEsFBgAAAAABAAEAQQAAAJEAAAAAAA==</erp:Content>
            <erp:FileName>PozSuppliersInt.zip</erp:FileName>
            <erp:ContentType>zip</erp:ContentType>
         </typ:document>
         <typ:jobDetails>
            <erp:JobName>/oracle/apps/ess/prc/poz/supplierImport,ImportSuppliers</erp:JobName>
            <erp:ParameterList>NEW,N</erp:ParameterList>
         </typ:jobDetails>
         <typ:notificationCode>00</typ:notificationCode>
         <typ:callbackURL>#NULL</typ:callbackURL>
      </typ:importBulkData>
   </soapenv:Body>
</soapenv:Envelope>

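To invoke the service programmatically, a minimal sketch in Python could look like the following. The host name, credentials, and the envelope file name are placeholders; the envelope is the request shown above.

# Sketch: post the importBulkData envelope to the ErpIntegrationService
# endpoint using basic authentication. Host name and credentials are
# placeholders; importBulkData.xml holds the request shown above.
import requests

ENDPOINT = ("https://your-instance.fa.ocs.oraclecloud.com"
            "/fscmService/ErpIntegrationService")  # placeholder host

with open("importBulkData.xml", "rb") as f:
    envelope = f.read()

response = requests.post(
    ENDPOINT,
    data=envelope,
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "importBulkData",
    },
    auth=("integration.user", "password"),  # placeholder credentials
    timeout=120,
)
response.raise_for_status()
print(response.text)  # importBulkDataResponse containing the process id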

Response

<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"
              xmlns:wsa="http://www.w3.org/2005/08/addressing">
   <env:Header>
      <wsa:Action>http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService//ErpIntegrationService/importBulkDataResponse</wsa:Action>
      <wsa:MessageID>urn:uuid:ed27f0b1-2feb-432a-ad09-2b5c0f6c90db</wsa:MessageID>
   </env:Header>
   <env:Body>
      <ns0:importBulkDataResponse xmlns:ns0="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/types/">
         <ns0:result>1124309</ns0:result>
      </ns0:importBulkDataResponse>
   </env:Body>
</env:Envelope>

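The result element contains the process id (ESS request id) of the initiated job. The same service also exposes a getESSJobStatus operation, so a hedged sketch for polling the job until it reaches a terminal state could look like this (host, credentials, and polling interval are assumptions):

# Sketch: poll the returned ESS request id via the getESSJobStatus
# operation of the same service until the job reaches a terminal state.
import time
import requests

ENDPOINT = ("https://your-instance.fa.ocs.oraclecloud.com"
            "/fscmService/ErpIntegrationService")  # placeholder host

STATUS_ENVELOPE = """<soapenv:Envelope
  xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
  xmlns:typ="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/types/">
  <soapenv:Body>
    <typ:getESSJobStatus>
      <typ:requestId>1124309</typ:requestId>
    </typ:getESSJobStatus>
  </soapenv:Body>
</soapenv:Envelope>"""

while True:
    r = requests.post(ENDPOINT, data=STATUS_ENVELOPE,
                      headers={"Content-Type": "text/xml; charset=utf-8"},
                      auth=("integration.user", "password"))  # placeholders
    # SUCCEEDED and ERROR are the common terminal states; others exist
    if "SUCCEEDED" in r.text or "ERROR" in r.text:
        break
    time.sleep(30)  # poll every 30 seconds
print(r.text)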



Initiated processes


Imported Supplier

A few pointers
The element notificationCode is a two-digit value:

First digit (channel):
  0 - No notification
  1 - Email
  2 - Bell
  3 - Email and Bell
Second digit (condition):
  0 - Send in any case
  1 - Send on success
  2 - Send on error

For example, 12 sends an email notification only when the process ends in error, while 00 (used in the request above) sends no notification at all.

The element callbackURL can be used to provide a (local) endpoint to a web service that will be invoked upon completion of the load and import sequence.
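When a real endpoint is supplied instead of #NULL, Oracle posts a SOAP status message to that URL when the sequence completes. A minimal sketch of such a listener follows; the port and the plain http.server framing are assumptions for illustration only.

# Sketch: minimal listener for the callbackURL. Oracle posts a SOAP status
# message here when the load and import sequence completes; this stub just
# logs the raw body and acknowledges with HTTP 200.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(body.decode("utf-8", errors="replace"))  # parse status here
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), CallbackHandler).serve_forever()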

PsCIA - PeopleCode & File Pre-Processors


DDL for Tables created by PeopleCode & File Pre-Processors:

You must run the PeopleCode pre-processor to generate the database tables that Change Impact Analyzer reads and analyzes.

The PeopleCode pre-processor generates two tables:
• PSCIAPCXREF
• PSCIAPCXREFTIME

You must have permission to create, delete from, and write to these tables. If you have full privileges, you can create these tables prior to using Change Impact Analyzer.

Note. Database administrators may consider assigning user IDs and passwords that grant read-only access to the database. However, users still need create and write access to the work tables used by Change Impact Analyzer.

The File pre-processor generates the following database tables, which Change Impact Analyzer reads to perform its analysis.
• PSFILEPROCESSRUN
• PSSQLXREFDEFN
• PSSQLXREFITEM
• PSFILESQLXREFDEFN
• PSFILEXREFDEFN

Note. You must have permission to create, delete from, and write to these tables. If you have full privileges, you can run the pre-processor and create these tables beforehand.

The DDL for the above tables (DB2/UNIX and DB2/OS390) follows. The trailing IN clause is left open so you can supply your own tablespace name:

-- Table 1 - PSCIAPCXREF

CREATE TABLE OWNERID.PSCIAPCXREF (
PROGSEQ INT NOT NULL
,REFOBJECTID1 SMALLINT NOT NULL
,REFOBJECTVALUE1 CHAR(60) NOT NULL
,REFOBJECTID2 SMALLINT NOT NULL
,REFOBJECTVALUE2 CHAR(60) NOT NULL
,REFOBJECTID3 SMALLINT NOT NULL
,REFOBJECTVALUE3 CHAR(60) NOT NULL
,REFOBJECTID4 SMALLINT NOT NULL
,REFOBJECTVALUE4 CHAR(60) NOT NULL
,REFOBJECTID5 SMALLINT NOT NULL
,REFOBJECTVALUE5 CHAR(60) NOT NULL
,REFOBJECTID6 SMALLINT NOT NULL
,REFOBJECTVALUE6 CHAR(60) NOT NULL
,REFOBJECTID7 SMALLINT NOT NULL
,REFOBJECTVALUE7 CHAR(60) NOT NULL
,OBJECTID1 SMALLINT NOT NULL
,OBJECTVALUE1 CHAR(60) NOT NULL
,OBJECTID2 SMALLINT NOT NULL
,OBJECTVALUE2 CHAR(60) NOT NULL
,OBJECTID3 SMALLINT NOT NULL
,OBJECTVALUE3 CHAR(60) NOT NULL
,OBJECTID4 SMALLINT NOT NULL
,OBJECTVALUE4 CHAR(60) NOT NULL
,OBJECTID5 SMALLINT NOT NULL
,OBJECTVALUE5 CHAR(60) NOT NULL
,OBJECTID6 SMALLINT NOT NULL
,OBJECTVALUE6 CHAR(60) NOT NULL
,OBJECTID7 SMALLINT NOT NULL
,OBJECTVALUE7 CHAR(60) NOT NULL
)
IN

-- Table 2 - PSCIAPCXREFTIME
CREATE TABLE OWNERID.PSCIAPCXREFTIME (
LASTRUN CHAR(120)
)
IN

-- Table 3 - PSFILEPROCESSRUN
CREATE TABLE OWNERID.PSFILEPROCESSRUN (
LASTREFRESHDTTM TIMESTAMP
,FPPSTATUS CHAR(10) NOT NULL
)
IN

-- Table 4 - PSSQLXREFDEFN
CREATE TABLE OWNERID.PSSQLXREFDEFN (
XREF_ID CHAR(128) NOT NULL
,XREF_SQLTYPE CHAR(1) NOT NULL
,DBTYPE CHAR(1) NOT NULL
,XREF_GROUPBY_CNT SMALLINT NOT NULL
,XREF_ORDERBY_CNT SMALLINT NOT NULL
,XREF_HAVING_CNT SMALLINT NOT NULL
,XREF_SUBQUERY_CNT SMALLINT NOT NULL
,XREF_OUTERJOIN_CNT SMALLINT NOT NULL
,LASTUPDDTTM TIMESTAMP
,SQLTEXT LONG VARCHAR
)
IN

-- Table 5 - PSSQLXREFITEM
CREATE TABLE OWNERID.PSSQLXREFITEM (
XREF_ID CHAR(128) NOT NULL
,XREF_SQLTYPE CHAR(1) NOT NULL
,DBTYPE CHAR(1) NOT NULL
,XREF_ITEMSEQ SMALLINT NOT NULL
,XREF_ITEMTYPE CHAR(4) NOT NULL
,XREF_RECNAME CHAR(30) NOT NULL
,XREF_VALUE CHAR(128) NOT NULL
,XREF_USAGE CHAR(4) NOT NULL
,XREF_CLAUSE CHAR(5) NOT NULL
)
IN

-- Table 6 - PSFILESQLXREFDEFN
CREATE TABLE OWNERID.PSFILESQLXREFDEFN (
SOURCE_FILE CHAR(40) NOT NULL
,FILE_TYPE CHAR(4) NOT NULL
,XREF_ID CHAR(128) NOT NULL
)
IN

-- Table 7 - PSFILEXREFDEFN
CREATE TABLE OWNERID.PSFILEXREFDEFN (
SOURCE_FILE CHAR(40) NOT NULL
,CALLED_FILE CHAR(40) NOT NULL
,FILE_TYPE CHAR(3) NOT NULL
)
IN

How To Plan For A Successful Data Conversion In Your Move to Oracle Cloud ERP

Many organizations are moving to Oracle Fusion cloud services, taking advantage of the enhanced levels of support and the overall reduction in cost of ownership the product offers. If your organization is preparing to implement Fusion, there are many workstreams to plan for. In this post, I will focus on one of the big ones, the impact of which is often underestimated: data conversion for Oracle Cloud ERP.

No matter which system you're moving away from (PeopleSoft, Oracle EBS, etc.), exporting the data you need and loading it to Fusion will be a tedious and time-consuming effort (just as any data conversion effort from one system to another would be). However, with proper planning it can be executed logically and methodically, minimizing potential roadblocks and the risk of unpleasant surprises.

Because Cloud ERP is a SaaS product, you will not have access to update any of the tables yourself and instead must leverage the data conversion tools Oracle provides for populating Fusion with your organization's data. The primary tools you'll work with are File Based Data Import (FBDI) and HCM Data Loader (HDL), and it's critical to take the time upfront to understand how each tool works and which business objects it supports.

In follow-up posts, I will cover a few data conversion approaches that use open source tools to reduce the time and effort spent on converting data. These tools can be used to profile the data upfront and report on its quality, and to build the conversion logic using an ETL-based approach, making the extract-and-transform part of the transition much smoother.