First, check whether a Packaged Connector (delivered by Workday) exists for the third party; with a little tweaking it may meet the need directly.
Next, check whether any Core Connector (template-based) can meet the integration requirement.
The next step is to check whether a combination of a custom report and the Enterprise Interface Builder (EIB) can do it.
If none of the above meets the requirement, use Workday Studio. This is the final option, where we should be able to design and build any complex integration.
Summary:
Packaged Connectors - Delivered by Workday to connect with a third party [End-to-End]
Core Connectors - Template-based.
Enterprise Interface Builder - Custom report or web service combination.
Workday Studio - Multiple output files, reading from different sources, exception handling, etc.
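The selection order above can be sketched as a simple helper. The requirement flags below are illustrative stand-ins for your analysis questions, not a Workday API:

```python
# Sketch of the tool-selection order described above.
# The dictionary keys are illustrative requirement flags, not Workday objects.
def choose_integration_tool(req: dict) -> str:
    """Return the first Workday integration tool that fits the requirement."""
    if req.get("packaged_connector_exists"):      # delivered, end-to-end
        return "Packaged Connector"
    if req.get("core_connector_template_fits"):   # template-based
        return "Core Connector"
    if req.get("report_plus_eib_sufficient"):     # custom report + EIB
        return "EIB"
    return "Workday Studio"                       # final option for complex builds

print(choose_integration_tool({"core_connector_template_fits": True}))  # Core Connector
```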
Integration System
A tenanted definition of an integration between Workday and an external system, based on a template that provides the methodology for communicating data.
Reference ID
A unique identifier used to look up data for integration purposes.
System User
An account associated with and required to launch a Connector or Studio integration. Workday delivered integrations and custom integrations require a system user account for authentication and web service calls. A system user account is not associated with a person in Workday.
Integration Template
A collection of integration services that enables communication between Workday and an external system. Workday provides integration templates in categories such as Benefits, Financials, HCM, Payroll, Payroll Interface, Procurement, Recruiting, Security, and Settlement. Many of the delivered templates contain default values for attributes, as well as prompt values for attributes and maps, to define the integration further.
Integration Event
The record of an integration process. Every integration—current or past, involving the import or export of data, successful or not—gets recorded as an integration event. The integration event contains all the information about the integration process, including its status.
Connector
A set of one or more integration templates that provide a framework for building integrations in a particular functional area. The integration can support a specific type of data, or a specific endpoint (vendor, legacy system, third-party payroll).
Enterprise Interface Builder (EIB)
An integration tool that enables us to create simple, secure, and customizable integrations with Workday. Alternatively, an EIB is a simple integration created with that tool. An EIB consists of an integration system, an integration data source, an integration transformation, and an integration transport protocol.
Integration Field Overrides
A service that lets us customize integration systems that are based on a connector template. Field overrides are managed through an integration service. They use calculated fields or report fields to supply values to an integration system. Example: member IDs in benefit provider integrations.
Integration Attribute
An integration component that specifies the tenanted value of a data element in Workday. Example: Master Policy Number is a type of attribute in benefit provider integrations.
Integration Data Source
Indicates the type of data that Workday receives from or exports to an external system and its location.
Workday Web Services
Workday’s public API. Based on open standards, Workday Web Services (WWS) provide the core method for integration with Workday.
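As a rough illustration of how WWS is typically invoked, the sketch below builds a raw SOAP request. Get_Workers is a real operation in the WWS Human_Resources service, but the WS-Security layout shown is a common pattern rather than confirmed from this document, and the endpoint URL, version, and credentials are placeholders:

```python
# Minimal sketch of a Workday Web Services (WWS) SOAP request.
# Get_Workers / Human_Resources are real WWS names; the security header
# shape, endpoint, and credentials here are illustrative placeholders.
def get_workers_envelope(username: str, password: str, count: int = 1) -> str:
    """Build a Get_Workers SOAP request body with a UsernameToken header."""
    return f"""<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"
              xmlns:bsvc="urn:com.workday/bsvc">
  <env:Header>
    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
      <wsse:UsernameToken>
        <wsse:Username>{username}</wsse:Username>
        <wsse:Password>{password}</wsse:Password>
      </wsse:UsernameToken>
    </wsse:Security>
  </env:Header>
  <env:Body>
    <bsvc:Get_Workers_Request>
      <bsvc:Response_Filter><bsvc:Count>{count}</bsvc:Count></bsvc:Response_Filter>
    </bsvc:Get_Workers_Request>
  </env:Body>
</env:Envelope>"""

# POST this envelope to your tenant's Human_Resources service endpoint
# (e.g. https://<host>/ccx/service/<tenant>/Human_Resources/<version>, placeholder).
```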
Integration Map
An integration component that specifies how values in Workday map to values in an external system. Example: Pay Rate Frequency is a type of map in third-party payroll integrations.
Integration Service
A group of related integration attributes, maps, and XSLT that provides a framework to transform Workday data into the format required by an external system.
Integration Transformation
Converts data into a format that Workday or a receiving external system can understand. Workday provides some delivered transformations, and we can also create custom transformations.
Integration Transport Protocol
Controls how Workday exports data to an external endpoint or service or imports the data from an external endpoint or service. Workday supports several types of transport protocols, including email, FTP and SFTP, HTTP/SSL, Workday attachments, and Workday Web Services.
Workday Studio
An Eclipse-based development environment that enables us to build more complex integrations with Workday.
Workday Studio is one of the three ways of building your integrations (the other two being EIB and Core Connectors).
Below are some of the details.
Cloud Explorer:
This is where we connect Studio to the tenant (it can be any tenant). Pass the tenant name, user ID, and password to get connected, under:
Preferences > Workday > Connections
** Be cautious when right-clicking on a project/integration: don't click Remove, which will remove it from the server/tenant.
Project Explorer:
This is a view of the workspace, or local file system. Each folder is an integration; clicking on the Assembly opens the design.
Outline:
Gives the complete picture of the integration, making navigation easy. Check out the different views by clicking the four icons in the Outline window:
Design | Tree Structure | mVal | Props (properties) and Variables
Schema Explorer: We can add a WSDL or XSD / WWS definitions (most commonly used) / a custom report schema (RaaS) / XSLT.
There are three perspectives for viewing applications: WD | Debug | Design Report (BIRT)
Dashboarding and mobile visualizations provide interfaces to the entire ecosystem, allowing users to consume, analyse, and respond.
Analytics and BI
The analytical engines allow businesses to derive actionable insight into the health of their investment strategies and supporting operating platform.
Workflow
Workflow handles and optimizes high value business processes, providing integration points across users and applications, whilst unlocking process level efficiency metrics.
Transaction Engines
These are the core processing engines, such as CRM, accounting, treasury, and risk, which are fit for purpose and leveraged effectively.
Robotics
Robotic process automation is the emerging, tactical solution for activities that were historically candidates for business process outsourcing.
Integration and SOA
Exposing and standardizing data as services and APIs allows data from master sources to be combined and leveraged in real time.
The Oracle workforce structure should be configured before the employee conversion:
HCM Organizations (Business Units, Legal Employers, Departments)
Locations (employee assignment addresses)
Actions (Hire, Re-Hire, Data Change, Assignment Change)
Jobs
Actions and action reasons for assignments need to be mapped from source to target, or created as per the requirement.
The employee types to be converted from the legacy system need to be determined, e.g. permanent employees and contractors.
All the active employees will be converted as part of the conversion.
Assignment number for Employees will start with “E” (E001) and Work Terms with “ET”(ET001).
Assignment number for Contingent Workers will start with “C” (C001) and Work Terms with “CT”(CT001).
Oracle HCM accepts only one company transfer per day. For employees with more than one company transfer in a day, the data needs to be cleaned up in the source system, or new logic needs to be implemented during conversion.
Employee User/Role assignment
Auto role provisioning needs to be finalized prior to employee conversion in order for employees to log in to Oracle Cloud. Generally, "Employee" is the default role assigned to all converted workers.
A user account naming convention needs to be decided for SSO (e.g. email ID, employee ID, or a combination of first name and last name).
Contractors requiring access to self-service also need to be converted as users.
Person Number Generation/Assignment Option:
Terminated Employees Conversion:
A decision on whether terminated workers will be converted from the legacy system to the cloud will be made based on the following scenarios:
Many organizations are moving to Oracle Fusion Cloud services, taking advantage of the enhanced levels of support and the overall reduction in cost of ownership the product offers. If your organization is preparing to implement Fusion, in this post I will focus on one of the biggest challenges, whose impact is often underestimated: data conversion for Oracle Cloud ERP.
No matter which system you’re moving away from (PeopleSoft, Oracle EBS, etc.), exporting the data you need and loading it to Fusion will be a tedious and time-consuming effort (just as any data conversion effort from one system to another would be). However, with proper planning it can be executed logically and methodically, minimizing potential roadblocks and the risk of unpleasant surprises.
Because Cloud ERP is a SaaS product, you will not have access to update any of the tables yourself and instead must leverage the data conversion tools Oracle provides for populating Fusion with your organization's data. The primary tools you'll work with are the File-Based Data Loader (FBDL) and the HCM Data Loader (HDL), and it's critical to take the time upfront to understand how each tool works and which business objects it supports.
I will be following up with a few data conversion approaches that can reduce the time and effort spent on converting data by using open-source tools. We can use these tools to profile the data upfront, report on data quality, and build the conversion logic with an ETL-based approach to make the extraction and transition smooth.
The Transaction Design Studio is available within the HCM Experience Design Studio in its initial BETA phase, with limited features and actions available to configure at this time. The Transaction Design Studio allows you to create rules to configure transactions and pages in the responsive user-designed pages. You can change how sections and fields are displayed based on the user's role and the employee's business unit or legal employer. You can:
Control the visibility of regions and sections on the page.
Control the visibility for attributes within a page, region, or a section.
Change the required status of optional attributes.
Control the availability of the questionnaire page for actions that use the guided process design.
You can create one or more rules for any page available in the Transaction Design Studio to manage your business needs. For example:
Make different fields visible and required in the new hire flow for employees in the US and employees in other countries.
If employees in the US don’t get salary increases as part of a promotion, hide the salary and compensation regions for US employees only, while making these regions available for employees in other countries when being promoted.
Hide the Ethnicity and Religion fields from the Personal Details page for countries or legal employers that you don’t want to store that information. You can still modify the person spotlight and upload images for pages that use the HCM landing page design.
BUSINESS VALUE OF TRANSACTION DESIGN STUDIO
Using the Transaction Design Studio is easy. The design is like any other newly designed responsive page, for a seamless experience. When configuring a page or transaction, the sections and fields map directly to the user-facing page, so you know exactly what you're looking at and configuring. Once you save a rule, you can quickly test it by accessing the page from global search or quick actions to see the results of your configured rule. There's no more guessing which part of the page you need to edit, and no need for complex EL expressions to vary the page as when using Page Composer. You don't need a technical resource to create or maintain your rules, so as business rules change, your HR analyst can make the changes themselves. Really complex business requirements may still require Page Composer, but in general, configuring your pages for different populations of employees is now a lot simpler.
It is now possible to run the Quick Retroactive Pay for a Single Worker task to enable the processing of retroactive pay. This task adds the retroactive element entries and the pay period for which retroactive pay is applicable, and it may be added to an existing flow or a new flow.
Support for Latest Person Search on Classic Payroll Pages
The latest Person Search feature that existed for responsive payroll pages has now been extended to the classic Payroll pages as well.
Read-Only Version of Manage Payroll Relationship
The Manage Payroll Relationship page can now be accessed in read-only mode.
Display Person Number on Personal Payment Method, Payslip, and Year-End Documents
Personalization can now be used to include the person number in the personal information page headers for the Personal Payment Method, Payslip, and year-end documents.
Removal of Unprocessed Payroll Element Batches
When an element entry created by the HCM Data Loader is rolled back, the application also rolls back events and notifications that have not been consumed or processed, based on the following:
If the event has an unprocessed status, the event is removed. The event is not removed if it has already been processed, for example by retroactive pay.
If the batch action created an element entry, the event is removed. Other types of element entry events, such as update element entry events, are not supported.
If the element entry has not been updated, the event is removed. The batch event is not deleted if the element entry has been updated, such as by a change in an input value.
This applies only to element entries created by the HCM Data Loader.
Display All Hours in Payroll Reports
The following reports have been enhanced with an option to report hours from Supplemental Earnings and other element classifications, in addition to the Regular/Standard Earnings and Absence Earnings hours reported in earlier versions:
Gross-to-Net Report
Payroll Activity Report
Payroll Register
Payroll Balance Report
Payroll and Time Definition List of Values REST API
In 19C, REST APIs are provided to retrieve a real-time, up-to-date list of values for selecting the payroll and/or overtime period to be assigned to an employee.
For the payroll definition list of values, use payrollDefinitionsLOV; for the time definition list of values, use payrollTimeDefinitions.
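A minimal sketch of querying these LOV resources follows. Only the resource names (payrollDefinitionsLOV, payrollTimeDefinitions) come from the release notes above; the host, REST version segment, query parameters, and credentials are placeholder assumptions you must replace for your pod:

```python
# Sketch of calling the 19C payroll LOV REST resources.
# BASE is a placeholder; only the resource names are from the source text.
import base64
import json
import urllib.request

BASE = "https://your-pod.oraclecloud.com/hcmRestApi/resources/latest"  # placeholder

def lov_url(resource: str, limit: int = 25) -> str:
    """Build the GET URL for a list-of-values resource."""
    return f"{BASE}/{resource}?limit={limit}&onlyData=true"

def fetch_lov(resource: str, user: str, password: str) -> list:
    """Return the 'items' array of a LOV resource (basic auth)."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        lov_url(resource),
        headers={"Authorization": f"Basic {token}", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("items", [])

# Example (needs real credentials and a reachable pod):
# payrolls = fetch_lov("payrollDefinitionsLOV", "user", "pass")
# periods  = fetch_lov("payrollTimeDefinitions", "user", "pass")
```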
After a successful data migration, or after incremental updates using HDL, you must run a set of processes to create supplemental data or to optimize indexes for better performance. It is critical to understand when and how to run these programs.
Post Conversion Processes
1. Synchronize Person Records: Notifies consuming Oracle Cloud applications, such as Oracle Fusion Trading Community Model, of changes to person and assignment details since the last data load.
2. Refresh Manager Hierarchy: For performance reasons, the complete manager hierarchy for each person is extracted from the active data tables and stored in a separate manager hierarchy table, known as the de-normalized manager hierarchy (PER_MANAGER_HRCHY_DN).
3. Update Person Search Keywords: Several attributes of person, employment, and profile records are used as person-search keywords. This process copies keyword values in all installed languages from the originating records to the PER_KEYWORDS table, where they're indexed to improve search performance.
4. Optimize Person Search Keywords Index: Optimizes the index of the PER_KEYWORDS table to improve search performance.
5. Autoprovision Roles for All Users: Grants or removes roles based on the current role-provisioning rules. It evaluates ALL users in the system against the role-provisioning setup, so it is resource intensive and may create a lot of LDAP requests.
6. Send Pending LDAP Requests: Sends user-account requests to the LDAP directory. Run this process only when you want user accounts to be created or updated.
7. Send Personal Data for Multiple Users to LDAP: Ensures that the personal data held in your LDAP directory matches that held by Oracle HCM Cloud after bulk updates. The synchronized fields are first name, last name, email, and manager.
8. *Synchronize Person Assignments from Position: Required if you are using position management and PositionOverrideFlag is set to Y in Worker.dat for assignment records. You may want to run this process before Refresh Manager Hierarchy if the line manager is also synchronized. (Full HCM implementations only; not for coexistence.)
9. Calculate Seniority Dates: You cannot create V3 seniority records using HDL; only updates are allowed. After loading the Worker object via HDL, you must run this process to create default seniority records for workers based on the configured seniority date rules.
Auto-trigger Processes
For the Worker object, by default, these two processes run automatically when HDL completes:
Refresh Manager Hierarchy
Update Person Search Keywords
You can prevent either or both of these processes from running automatically by using a SET instruction in the Worker.dat file. This is especially useful when you have many batches for an employee data conversion: disabling the auto-triggered run after every single batch allows you to run these programs only once, after completing all the batches.
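For orientation, a SET instruction sits at the top of Worker.dat, before the METADATA lines. The fragment below is illustrative only: the exact SET parameter name that suppresses each auto-triggered process is not stated in this document and must be confirmed in the HCM Data Loader documentation; the one shown is a placeholder, and the METADATA attribute list is abbreviated for the example.

```text
COMMENT Illustrative Worker.dat header; the SET parameter name below is a
COMMENT placeholder - confirm the exact instruction in the HDL documentation.
SET SUPPRESS_POST_LOAD_PROCESSES Y
METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|StartDate
```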
When to run these post conversion programs?
Here is one good way to plan your post-conversion schedule.
Synchronize Person Records
One-time conversion: Yes, run after loading person data. Ongoing updates: Yes, if there are changes to person records on a daily basis.
Refresh Manager Hierarchy
One-time conversion: Yes (HDL auto-triggers it). Ongoing updates: Yes (HDL auto-triggers it). Run the program manually if the HDL auto-trigger is disabled.
Update Person Search Keywords
One-time conversion: Yes (HDL auto-triggers it). Ongoing updates: Yes (HDL auto-triggers it). Run the program manually if the HDL auto-trigger is disabled.
Optimize Person Search Keywords Index
One-time conversion: Yes. Ongoing updates: Yes.
Autoprovision Roles for All Users
One-time conversion: No. Ongoing updates: No. Auto role-provisioning rules should be configured BEFORE running HDL Worker loads; this lets user accounts be created along with requests to provision roles automatically. Do not schedule this process; instead, run it manually when role-provisioning rules are modified.
Send Pending LDAP Requests
One-time conversion: Yes. Ongoing updates: Yes. You should run this job after bulk loading workers via HDL. It is also a best practice to schedule this job daily to take care of ongoing user-access requests as well as to process future-dated LDAP requests.
Send Personal Data for Multiple Users to LDAP
One-time conversion: Yes. Ongoing updates: Depends. Required only if personal data (name, email, manager) is updated in bulk (via HDL, spreadsheet loaders, etc.). It is best to run the process once after the one-time conversion so you don't need to worry about data-loading dependencies, such as an initial load for basic information and a separate batch for line-manager updates.
Synchronize Person Assignments from Position
One-time conversion: Depends. Ongoing updates: Depends. Required if you are using position management and full HR (not coexistence). Given that changes can happen regularly within a position, you'll want this process to run on a regular basis. If you are synchronizing the line manager, it's recommended to run this process daily as well.
Calculate Seniority Dates
One-time conversion: Yes. Ongoing updates: Yes. Run this job after the initial conversion as well as after ongoing worker loads via HDL. Review the section below to set an appropriate batch size to avoid performance issues.
How to run these post conversion programs?
Synchronize Person Records
This job is resource intensive, as it publishes events in SOA for consuming applications, so run it after hours and with proper input parameters. Also note that this job processes only effective-dated transactions; i.e. it picks up future-dated hires only when the hire date becomes effective.
There are 3 input parameters for this ESS job.
1. From Date:
2. To Date:
3. After Batch Load: <Yes or No or Blank>. Set this parameter to Yes when you are running the process post HDL conversion.
The Job can be run in 2 ways.
Daily
Pass the system date in both date parameters and set the "After Batch Load" parameter to "No".
For a specific Period
Pass the date range in the parameters. Example: From Date 12-Aug-2019; To Date 18-Aug-2019;
Note: The maximum date range between the From Date and To Date parameters is 7 days. For the very first run, a date range of more than 7 days is accepted. On all subsequent runs, if the date range exceeds 7 days, the job ends in a Warning state and does not raise any events.
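Since the date range is capped at 7 days after the first run, a longer backfill can be split into compliant windows and the job submitted once per window. The 7-day cap comes from the note above; the helper itself is just an illustration:

```python
# Split a long backfill period into windows of at most 7 days each,
# matching the From Date / To Date cap described above.
from datetime import date, timedelta

def seven_day_windows(start: date, end: date, max_days: int = 7):
    """Split the inclusive range [start, end] into windows of <= max_days days."""
    windows = []
    cur = start
    while cur <= end:
        stop = min(cur + timedelta(days=max_days - 1), end)
        windows.append((cur, stop))
        cur = stop + timedelta(days=1)
    return windows

# Example: a 16-day backfill (1-16 Aug 2019) becomes three windows.
for frm, to in seven_day_windows(date(2019, 8, 1), date(2019, 8, 16)):
    print(frm, to)
```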
Refresh Manager Hierarchy
A person’s manager hierarchy could be derived from active data tables, but the impact of that approach on performance is unpredictable. Therefore, the complete manager hierarchy for each person is extracted from data tables and stored in a separate manager hierarchy table. This table is known as the denormalized manager hierarchy. Whenever a change is made to a person’s manager hierarchy through the application pages, the change is reflected automatically in the denormalized manager hierarchy table. You use the Refresh Manager Hierarchy process to populate the denormalized manager hierarchy table when person records are updated using data loaders (e.g. HDL).
There is 1 input parameter for this ESS job.
1. Updated Within the Last N Days: e.g. 1 day if the job is scheduled daily; leave it blank for the initial run or for a full-refresh reconciliation.
Update Person Search Keywords
There are 3 input parameters for this ESS job.
1. Batch Id: <number value or blank>. To run the job for a particular HDL batch, enter only the Batch Id and leave all other parameters blank. It will create or recreate the person keywords for all the people successfully loaded by that HDL batch.
2. Name: <person name LOV or blank>. To run the job for a specific person, pass only the person name and leave the other fields blank.
3. After Batch Load: <Yes, No, or blank>. To run the job in delta/incremental mode, pass only the 'After Batch Load' parameter as 'Yes'. It will create or recreate person keywords for all modified or new people in the system. This option works only when the delta (change) size is less than 20K. For a delta above 20K, run the job in all-people mode (i.e. do not pass any parameters).
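The parameter choice above can be captured in a small decision helper. The dictionary-of-parameters shape is illustrative, not an Oracle API; the 20K threshold is from the text:

```python
# Sketch of the Update Person Search Keywords parameter rules described above:
# a specific batch wins, delta mode only below 20K changes, else all-people mode.
from typing import Optional

def keywords_job_params(delta_size: int, batch_id: Optional[int] = None) -> dict:
    """Pick the ESS parameters per the rules above (illustrative shape)."""
    if batch_id is not None:
        return {"Batch Id": batch_id}        # rebuild keywords for one HDL batch
    if delta_size < 20_000:
        return {"After Batch Load": "Yes"}   # delta/incremental mode
    return {}                                # all-people mode: pass no parameters

print(keywords_job_params(5_000))   # {'After Batch Load': 'Yes'}
print(keywords_job_params(50_000))  # {}
```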
Optimize Person Search Keywords Index
Run this job only when there is minimal load on the system. Generally it is best to run it once daily during the maintenance window. If the 'Update Person Search Keywords' ESS job is also scheduled to run daily during maintenance, it is recommended to schedule this job right after it.
There are 2 input parameters for this ESS job
1. Maximum Optimization Time: <# of days or blank>; the default is 180 days.
2. Optimization Level: <Full Optimization, Rebuild the Index, or blank>; the default is Full Optimization. If you are rebuilding the index, the optimization time is ignored.
Note: recommendations will vary based on the employee population, system usage, DB usage, the data loaders used, index fragmentation, the 'Update Person Search Keywords' ESS job runs/schedules, etc.
Autoprovision Roles for All Users
Do not schedule this process for a daily or regular run; instead, run it manually as necessary. You need to run this process only if there are changes to the auto-provisioning rules setup. You then need to run the Send Pending LDAP Requests ESS job to actually process all the LDAP requests generated by this job.
There is 1 input parameter for this ESS job
1. Process Generated Role Requests: <Yes or No>. Set this parameter to No to defer the processing; deferring is better for performance, especially when thousands of role requests may be generated.
Send Pending LDAP Requests
Most changes to users and roles are shared automatically by Oracle HCM Cloud with Oracle Identity Management, but you may need to run this ESS job after mass updates (such as an HDL worker load or auto-provisioning changes). It is best to schedule this process daily so it can send pending requests, as well as future-dated requests that have become current, to Oracle Identity Management.
There are 2 input parameters for this ESS job.
1. User Type: <All, Party, Person>; the default is All.
2. Batch Size: <A>, where A auto-calculates the batch size. For example, if you have 1000 requests to be processed in 10 batches, the batch size is 100.
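The worked example above (1000 requests in 10 batches gives a batch size of 100) suggests a simple division; the sketch below is only an illustration of that arithmetic, not Oracle's documented auto-calculation:

```python
# Illustration of the batch-size arithmetic from the example above:
# batch size = pending requests / number of batches, rounded up.
import math

def auto_batch_size(total_requests: int, batches: int = 10) -> int:
    """Derive a batch size that covers total_requests in the given batches."""
    return math.ceil(total_requests / batches)

print(auto_batch_size(1000))  # 100
```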
Synchronize Person Assignments from Position
You must schedule this process to run on a regular basis. If you are synchronizing the manager, it's recommended to run this process daily. There are 3 input parameters for this ESS job.
1. Past Period to Be Considered in Days: the number of days in the past to be considered for updating the attribute in the assignments. The default value is 30 days. For example, if you set this parameter to 60 days, any assignment records with start dates during the previous 60 days are synchronized from positions.
2. Run at Enterprise Level: <Yes, No>. Select Yes to run the process for the enterprise, or No to run it for a specific legal entity.
3. Legal Employer: <legal employer name LOV>. The legal entity for which you want to run the process.
Calculate Seniority Dates
There are 6 input parameters for this ESS job.
1. Person Number: Comma separated list of person numbers you want the ESS job to run for, or leave it blank to run it for all employees.
2. Past Period in Days: the default is 1 day. It determines how many changes get picked up by the process.
3. Include Terminated Work Relationships: determines whether seniority data for terminated workers will be generated.
4. Legal Employer: <Legal employer name LOV> – Legal entity for which you want to run the process.
5. Union: <Union LOV> – Union name for which you want to run the process.
6. Selected Seniority Date Rules: List of rules you want to populate for the workers.
Note: the Past Period in Days parameter impacts how many records get picked up for deriving seniority dates. If some records have not been modified for a long time, run the ESS job once with a larger number as the period. However, running with a large past period will impact the performance of the ESS job. Multi-threading is one option to improve performance; it is configured by setting a batch/chunk size via a profile option. The PER_EMP_SD_MAX_PROCESS_REC profile option sets the number of records processed per thread, which in turn controls the number of threads used, depending on the total number of records. The customer needs to create this profile option through the following steps.
Navigate to the task “Manage Profile Options”
Create a profile option with following details.
Profile Option Code : PER_EMP_SD_MAX_PROCESS_REC
Profile Option Display Name : PER_EMP_SD_MAX_PROCESS_REC
Application : Global Human Resources
Module : Employment
Start Date : 01/01/1951
Once the profile option is created, make it Enabled and Updatable at the Site level.
Now navigate to the task 'Manage Administrator Profile Values'.
Search for the newly created profile option (PER_EMP_SD_MAX_PROCESS_REC).
At the site level, set a desired value, e.g. 10000 or 2000. This is the chunk size, which will eventually determine the number of threads. (This value is to be decided by the customer as per their requirements.)
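Per the description above, the chunk size (records per thread) determines the thread count for a given load. The formula below is an illustration of that relationship, not an Oracle-documented calculation:

```python
# Illustration: PER_EMP_SD_MAX_PROCESS_REC is the records-per-thread chunk
# size, so the thread count is the total records divided by it, rounded up.
import math

def seniority_thread_count(total_records: int, chunk_size: int) -> int:
    """Approximate number of threads the ESS job would use."""
    return math.ceil(total_records / chunk_size)

print(seniority_thread_count(95_000, 10_000))  # 10
```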