Data Applications → Install and configure Oracle Cloud ERP v2

What's New

In the latest version of the data app made available in Q1 2024, the following changes have been made:

  • This app now leverages the Incorta Oracle Cloud ERP V2 BICC connector, which orchestrates jobs in Oracle Business Intelligence Cloud Connector (BICC).
  • This app can now use View Objects (VOs).

Configuration Prerequisites

Before you configure the Oracle Cloud ERP data application, your Oracle Cloud ERP administrator will need to create a single user with the correct roles to access the Oracle Cloud ERP Business Intelligence Cloud Connector (BICC) and Universal Content Management (UCM). Select the following links for instructions:

Note | Ensure that a single user has access to both BICC and Oracle Universal Content Management (UCM). If OCI Object Storage is used instead of UCM, refer to this link for more information.

Installation and configuration steps

Here are the installation steps:

  1. Import the BICC Customization File
  2. Install the Data Application
  3. Set up the Fusion Connection
  4. Load Data into Incorta
  5. (Optional) Load Data into Google BigQuery
  6. Configure the Cloud ERP Application
  7. Adding new fields to existing tables
  8. Adding new VOs in Incorta Jobs

Refer to the frequently asked questions and troubleshooting document if needed.

For any additional questions, please refer to our FAQs or contact our applications team.

Deployment use cases

When approaching the installation, it is important to understand how the data will ultimately be consumed. The supported usage patterns for this data app are:

  • Using the data application in Incorta only. Data is ingested into Incorta for analytics.
  • Using the data application in Incorta to manage workloads for Google BigQuery. Data is ingested into Incorta, and flattened outputs are provided to Google BigQuery for analytics. For this use case, be sure to complete the BigQuery destination step.


Installation Overview

Cloud installation summary

  • Import the BICC customization zip file in Oracle BIACM
  • Install the Oracle Cloud ERP data app from the Incorta Marketplace
  • Install the Fusion BICC connector from the Incorta Marketplace if required
  • Configure the Fusion data connection

On-Premise installation summary

  • Import the BICC customization zip file in Oracle BIACM
  • Download and unzip the Oracle Cloud ERP archive; it contains the tenant zip file, the BICC customization zip file, and a data folder.
  • Import the Incorta tenant and metadata files and complete the configuration
  • Install the Oracle Cloud ERP BICC connector from the Incorta Marketplace
  • Configure the Fusion data connection

Step 1: Import BICC customization zip file

Both Cloud and On-Premise installations must complete this step.

  • Log into Oracle BIACM. The URL will look like this one: https://sampledomain.fa.ocs.oraclecloud.com/biacm. Change the hostname to your own.
  • On the right-hand side, under Manage Offerings, import the customization zip file (e.g., exportCustomization_2020-04-06_18_01_40.zip).
  • Click the icon on the right-hand side to go to Manage Jobs, where you will see the jobs that the customization file has seeded (e.g., Job_Fusion_AP). OBICC.png
  • Refer to the Offerings Matrix table below to see which objects are needed to deploy a particular module. You may need to reset the last extract date at the job level if the customer has already run extracts in the past.
  • Log into UCM. The URL will look like this one; just change the hostname to your own: https://sampledomain.fa.ocs.oraclecloud.com/cs
  • In the UCM screen, click the Search > Advanced tab in the left-hand bar, enter OBIA% in the Security Group field, and select Search. You can then enter a string such as invoiceheaderpvo in the field at the top right to search for the file.

Step 2: Install the data application

Cloud installation steps

If you have access to install or request the data application in Incorta Analytics:

  1. Sign in to Incorta Analytics.
  2. Select Marketplace in the navigation bar.
  3. Select Applications from the submenu bar.
  4. Search for and select Oracle Cloud ERP data app (BICC).
  5. Select Install.

On-premise installation steps

Installing on a new tenant

  • Import the Incorta tenant (this will be part of the data app install in the future).
  • Log in to the Incorta CMC and make sure that the Loader and Analytics services have at least 6 GB of on-heap memory. If not, edit the on-heap settings and restart the services.
  • Go to Clusters > Cluster Name > Tenants, select +, select the Import Tenant option, and then select the TenantOracleCloudERP_<ver>.zip file.
  • Follow the guided steps to create the tenant and log in to the newly created tenant.

Installing on a pre-existing tenant

  • Import the schemas, business schemas, and dashboard folders associated with the data app.
  • Create a new Oracle Cloud Applications data connection called Fusion and configure it as per the screenshot below.
  • Ensure that Spark has adequate memory configured and that the following Spark Materialized View properties are configured in the CMC. Depending on the data volume, the application memory may have to be set higher. ServerConfigurations
  • Use the Incorta interface to navigate to Data > Local Data Files.
  • Select New and upload the files from your local mydata folder in the extracted zip file, making sure to use the overwrite option. LocalDataFiles

Configure the Oracle BICC and UCM

  1. Sign in to the Oracle BICC with the following URL and your host name: https://<YOUR_HOST_NAME>.oracledemos.com/biacm/.
  2. In the toolbar, select Manage Offerings and Data Stores > Import Modifications.
  3. In the Import Modifications dialog box that pops up:
    1. Select Choose File.
    2. Select the exportCustomization_<DATE>_<VERSION>.zip file from OracleFusion_<VERSION_NUMBER>.zip.
    3. Select Open.
    4. Select Import.
  4. Select the Offering drop-down list to view the Incorta offerings, such as Incorta AP.

Step 3: Set up a Fusion connection

  • Log in to Incorta and create or edit the Fusion data connection.
  • Choose which storage type (UCM or Object Storage) you want to get your data from:
    • For UCM storage (refer to screenshot below):
    • Common fields:
      • File name pattern : file_<catalog>_<schema>_<table>-batch*.zip
      • Root Query Text: dSecurityGroup <matches> `OBIAImport`
      • Leave the other options at their default values. If you want to read only files from a certain date (newer files may contain extra fields that older ones lack), you can add a date filter on the Start Timestamp field at the data source, for example >= '2024-01-01 12:00', which loads only files created or modified since that date.
      • BICC Jobs: Configure the BICC job names and job IDs here. If you have imported the tenant, the only thing you need to verify is the job ID for each job in BICC, as those might have changed in the customer deployment.

DataSource

The job names must match what is set up in the schema tables, for example:

EditDataset

Step 4: Load Data into Incorta

  • From the Scheduler tab, either create a load plan and add the schemas (refer to the Offerings Matrix below) or use an existing load plan. LoadPlan

  • For the initial full load, log in to Oracle BICC, navigate to the fourth icon on the right, and click Manage Jobs. BICCExtract

  • Then, for every schema in the load plan above, click the gear icon on the right and click Reset to full extract.

    BICCLoadJob

  • In Incorta execute the load plan and go to the Schema Load Jobs to monitor the execution. Incorta will automatically invoke the BICC JOB API, run the BICC extract and then stream the data file from UCM into Incorta and load the data.

  • Subsequent load plan executions after the full load will run in incremental mode.

  • Follow the BICC steps above if you want to reset to a full extract again. To check the job execution status in BICC, click Manage Job Schedules.

LoadSchedules

Step 5 (optional): Send data to Google BigQuery

CloudDataWarehouse

You will need to complete a couple of steps to prepare Incorta to push your Oracle Cloud ERP data to BigQuery.

At a high level, you will follow these steps to get Incorta talking to BigQuery:

  1. Configure Incorta to write to Google BigQuery via a Data Destination
  2. Load the schemas to push the source data first to Incorta and then to BigQuery
  3. In the Incorta UI navigate to Data->Data Destinations and find the GoogleBigQuery data destination. Refer to the Connecting and Pushing Data to BigQuery article available on Incorta Community for instructions on how to configure it to connect to your BigQuery instance. BigQueryDestination

Next, open the required schemas, assign the GoogleBigQuery data destination to each schema, and set the target schema (for example, ORACLE_ERP_CDC). When the schemas are loaded, Incorta creates the dataset in BigQuery and all the tables land there. Verify that these settings have been configured for all the schemas that are sent to BigQuery. SchemaDestination
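
As an optional check after the first load, you can confirm that the tables landed in the target dataset. This is a minimal sketch that assumes the ORACLE_ERP_CDC target schema from the example above and the default project selected in the BigQuery console:

-- List the tables Incorta created in the target BigQuery dataset (dataset name is the example above)
SELECT table_name
FROM ORACLE_ERP_CDC.INFORMATION_SCHEMA.TABLES
ORDER BY table_name;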

Offerings Matrix

For each module, the seeded Incorta jobs and the Incorta schemas needed are listed below.

Advanced Payables (AP)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_AP, Job_Fusion_PO
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_AP, Fusion_PO

Accounts Receivables (AR)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_AR, Job_Fusion_AR_SNP, Job_Fusion_AR_MV
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_AR, Fusion_AR_SNP, Fusion_AR_MV

Fixed Assets (FA)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_FA
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_FA

General Ledger and Subledgers (GL & XLA)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_GL, Job_Fusion_XLA
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_GL, Fusion_XLA, Fusion_XLA_MV

Inventory (INV)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_INV
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_INV

Order Management (ONT)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_ONT
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_ONT

Procurement (PO)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_PO
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_PO

Projects (PRJ)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_PRJ, Job_Fusion_GMS
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON, Fusion_PRJ, Fusion_GMS

GL Budgetary Control
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FND_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_Budgetary_Control
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_FND_Common, Fusion_FIN_Common, Fusion_PARTY_COMMON

HCM
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_HR_Common, Job_Fusion_HCM, Job_Fusion_HCM_REC, Job_Fusion_HCM_Payroll
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_HCM, Fusion_HCM_REC, Fusion_HCM_Payroll

Procurement (Sourcing)
  Seeded Incorta jobs: Job_Fusion_CAL_COMMON, Job_Fusion_Item_Common, Job_Fusion_HR_Common, Job_Fusion_FIN_Common, Job_Fusion_PARTY_COMMON, Job_Fusion_Sourcing
  Incorta schemas needed: Fusion_CAL_COMMON, Fusion_Item_Common, Fusion_HR_Common, Fusion_PARTY_COMMON

Step 6: Configure the Cloud ERP Application

The Incorta data application includes pre-built metadata that can be used to load data from Oracle Cloud ERP and view it in pre-built dashboards. It is a modular accelerator data application for Financials, Supply Chain, and HCM analytics that supports rapid deployment and can be easily customized.

Configure Seeded Session Variables

Configure the default values of the following Incorta session variables, which are used in the dashboards.

SessionVariables

Default_Ledger: You can get all ledger names from the Fusion_FIN_Common.Ledger table and then ask the customer for the default.

Default_Period: You can get all period names from the Fusion_FIN_Common.GLFiscalPeriod table and then ask the customer for the default.

Default_Start_Year: Update the year value as required.
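
As a minimal sketch, the candidate default values can be listed with queries like the following (run, for example, from an Incorta insight or the SQL interface); the column names are illustrative and should be checked against the actual table definitions:

-- List ledger names to choose a value for Default_Ledger (column name is an assumption)
SELECT DISTINCT LedgerName
FROM Fusion_FIN_Common.Ledger;

-- List period names to choose a value for Default_Period (column name is an assumption)
SELECT DISTINCT PeriodName
FROM Fusion_FIN_Common.GLFiscalPeriod;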

Other Optional Configurations

Configure GL Account KFF

Determine which Chart of Accounts should be used for reporting

Most customers will already know which chart of accounts they want to use for their reporting. Customers can query the Fusion_FIN_Common.Ledger table by creating an insight in Incorta to determine which chart_of_account_id to use. ChartofAccounts
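
A minimal sketch of such an insight, expressed as SQL; the column names are assumptions and should be matched to the actual Ledger table columns:

-- Show each ledger with its chart of accounts ID so the customer can pick the one used for reporting
SELECT LedgerName, chart_of_account_id
FROM Fusion_FIN_Common.Ledger;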

Get the list of segments used by customer from Fusion UI

In most cases customers will know which segments they want to use for reporting. Customers can look up the information from the Fusion UI as follows:

  • Log in as a user who can access the “Setup and Configuration” option.
  • Go to Setup and Maintenance for Financials and then go to Financial Reporting Structures.
  • Go to Manage Chart of Account Structures. AccountStructures
  • Type “GL” as the Key Flexfield Code and press Search. FlexFieldGL
  • Click “Manage Structures” and press Search. This will bring up a list of accounting structures. ManageStructures
  • Edit the structure the customer wants to use for reporting by clicking the pencil icon.
  • This will show the mapping of account segments. Use this information to update the Incorta objects to reflect the customer’s implementation of the account key flexfield. FlexFieldAccountegments

Update the Business Views to reflect the customer’s implementation

Once the segments needed for reporting are identified, update the business view to reflect the customer’s configuration.

In the FusionGeneralLedger Business Schema make sure that the correct segment columns from the code combinations table are mapped to the Entity, Cost Center and Account segments fields in the business schema. For example, out of the box, Incorta has mapped segment2 to cost center but in a customer’s instance, it could be segment3 etc.

FusionBSGL

If additional business views are needed for additional qualifying segments (only 3 are supported in the OOTB blueprints), add new business views. For example, create a new business view called product and put all the columns in that view for that segment. If the segment has no hierarchy, then just add the <segment name>_c and DESC_<segment_name> columns into the business view.

If the segment has a hierarchy, the hierarchy is already flattened in the VO. Add all the columns for each level in the hierarchy to the business view. For example, for a Product view, add two columns per level (e.g., levels 1 through 5).

Update the join between the Fusion_XLA.XLAAccounts table and the Fusion_FIN_COMMON.CodeCombination table

The Fusion_XLA.XLAAccounts table holds the accounts per module (AP, AR, etc.) and is used in the GL reconciliation dashboards.

  • Edit the table and select the correct segment field for the account. The default value that comes out of the box is segment3. Save and load the table.
  • The correct segment from the Fusion_FIN_Common.CodeCombination table, based on the customer’s configuration, must be joined to the account segment. For example, if segment2 is the accounting segment, then update the join below to segment2 (see the SQL sketch that follows). XLAjoin
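
A sketch of the intended join, expressed as SQL for illustration only (the join itself is defined in the Incorta UI); the XLAAccounts account column name is an assumption, and segment2 stands in for the customer's accounting segment:

-- Join each XLA account value to the matching accounting segment of the code combination
SELECT xla.*, gcc.codecombinationid
FROM Fusion_XLA.XLAAccounts xla
JOIN Fusion_FIN_COMMON.CodeCombination gcc
  ON xla.ACCOUNT = gcc.CODECOMBINATIONSEGMENT2;  -- xla.ACCOUNT is illustrative; use the customer's segment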

Update the chart of accounts in the MV GLAccountHierFlat in the Fusion_FIN_Common schema

Update the chart of accounts and tree code in the MV named GLAccountHierFlat in the Fusion_FIN_Common schema, as shown below, to reflect the customer's configuration. Also check and update the segment columns used. Here are the questions you can ask the customer to determine the parameters used in the Incorta GLAccountHierFlat MV:

  1. What is the name of the account tree that you want to use?
  2. What is the name of the account tree version that you want to use?
  3. What is the fnd value set category name for the GL account hierarchy nodes?

You will use the answers to the above questions to update the MV with the values specific to the customer’s implementation.

  • If a customer has other GL segment hierarchies they want to bring into Incorta (e.g., a cost center hierarchy), they must create similar MVs and use the same SQL, replacing the parameters with values specific to those hierarchies (see the parameter sketch after the SQL below).
  • They will also create joins from the CodeCombination PVO to this new MV, similar to the GL account segment hierarchy.
  • They will also need to create one alias per hierarchy if they have multiple hierarchies for the same segment (using the hierarchy name as a filter in the join).
  • Note that the hierarchy name cannot be dynamically selected from the dashboard because Incorta joins are computed at load time.

MVQuery

Example: Edit the SQL in the MV and replace the tree version name, the code combination segment to join on (in this example, segment3 is used for account), and the value category for the segment. These are highlighted in the SQL below.

SELECT
gsh.dep31pk1value || ' - ' || lvl1.DESCRIPTION AS account_level1,
gsh.dep30pk1value || ' - ' || lvl2.DESCRIPTION AS account_level2,
gsh.dep29pk1value || ' - ' || lvl3.DESCRIPTION AS account_level3,
gsh.dep28pk1value || ' - ' || lvl4.DESCRIPTION AS account_level4,
gsh.dep27pk1value || ' - ' || lvl5.DESCRIPTION AS account_level5,
gsh.dep26pk1value || ' - ' || lvl6.DESCRIPTION AS account_level6,
gsh.dep25pk1value || ' - ' || lvl7.DESCRIPTION AS account_level7,
gsh.dep24pk1value || ' - ' || lvl8.DESCRIPTION AS account_level8,
gsh.dep0pk1value || ' - ' || lvl9.DESCRIPTION AS account_level9,
gcc.CODECOMBINATIONSEGMENT1,
gcc.CODECOMBINATIONSEGMENT2,
gcc.CODECOMBINATIONSEGMENT3,
gcc.CODECOMBINATIONSEGMENT4,
gcc.CODECOMBINATIONSEGMENT5,
gcc.CODECOMBINATIONSEGMENT6,
gcc.codecombinationid,
lvl2.FLEXVALUE lvl2_flexvalue,
lvl3.FLEXVALUE lvl3_flexvalue,
lvl4.FLEXVALUE lvl4_flexvalue
FROM
Fusion_FIN_COMMON.flex_tree_vs_gl_account_vi gsh,
(
SELECT
gcf.treestructurecode,
gcf.treecode,
gcf.treeversionid,
gcf.dep0pk1value,
MAX(gcf.distance) AS maxdistance
FROM
Fusion_FIN_COMMON.flex_tree_vs_gl_account_vi gcf,
Fusion_FIN_COMMON.fndtreeversiontlp1 tvt
WHERE
gcf.treestructurecode = tvt.treestructurecode
AND gcf.treecode = tvt.treecode
AND gcf.treeversionid = tvt.treeversionid
AND tvt.treeversionname = 'Account_Tree_Current1'
GROUP BY
gcf.treestructurecode,
gcf.treecode,
gcf.treeversionid,
gcf.dep0pk1value
) hier,
Fusion_FIN_COMMON.fndflexvaluespvo lvl1,
Fusion_FIN_COMMON.fndflexvaluespvo lvl2,
Fusion_FIN_COMMON.fndflexvaluespvo lvl3,
Fusion_FIN_COMMON.fndflexvaluespvo lvl4,
Fusion_FIN_COMMON.fndflexvaluespvo lvl5,
Fusion_FIN_COMMON.fndflexvaluespvo lvl6,
Fusion_FIN_COMMON.fndflexvaluespvo lvl7,
Fusion_FIN_COMMON.fndflexvaluespvo lvl8,
Fusion_FIN_COMMON.fndflexvaluespvo lvl9,
Fusion_FIN_COMMON.CodeCombination gcc
WHERE
gsh.dep0pk1value = hier.dep0pk1value
AND gsh.distance = hier.maxdistance
AND gsh.treestructurecode = hier.treestructurecode
AND gsh.treecode = hier.treecode
AND gsh.treeversionid = hier.treeversionid
AND gsh.dep31pk1value = lvl1.FLEXVALUE
AND gsh.dep30pk1value = lvl2.FLEXVALUE
AND gsh.dep29pk1value = lvl3.FLEXVALUE
AND gsh.dep28pk1value = lvl4.FLEXVALUE
AND gsh.dep27pk1value = lvl5.FLEXVALUE
AND gsh.dep26pk1value = lvl6.FLEXVALUE
AND gsh.dep25pk1value = lvl7.FLEXVALUE
AND gsh.dep24pk1value = lvl8.FLEXVALUE
AND gsh.dep0pk1value = lvl9.FLEXVALUE
AND gsh.dep0pk1value = gcc.CODECOMBINATIONSEGMENT3
AND lvl1.valuecategory = 'Account'
AND lvl2.valuecategory = 'Account'
AND lvl3.valuecategory = 'Account'
AND lvl4.valuecategory = 'Account'
AND lvl5.valuecategory = 'Account'
AND lvl6.valuecategory = 'Account'
AND lvl7.valuecategory = 'Account'
AND lvl8.valuecategory = 'Account'
AND lvl9.valuecategory = 'Account'
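
For a hierarchy on another segment, only a few parameters in the SQL above change. The following is an illustrative sketch for a hypothetical cost center hierarchy; the tree version name, segment number, and value category shown are placeholders to be replaced with the customer's actual values:

-- Parameters to change when cloning the MV for a cost center hierarchy (illustrative values):
--   AND tvt.treeversionname = 'Cost_Center_Tree_Current1'   -- the customer's tree version name
--   AND gsh.dep0pk1value    = gcc.CODECOMBINATIONSEGMENT2   -- the segment that holds cost center
--   AND lvl1.valuecategory  = 'Cost Center'                 -- the FND value set category (repeat for lvl2..lvl9)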

Load the MV GLAccountHierFlat in the Fusion_FIN_Common schema

Load the MV GLAccountHierFlat in the Fusion_FIN_Common schema. This will refresh the data to reflect the customer’s configuration.

Update Reconciliation Incorta Analyzer table to reflect correct account segment column

The OOTB data app uses Segment2 for Account. If a customer uses a different segment column, the Incorta Analyzer table named XLA_Accounts needs to be updated to reflect this. XLAAccounts

Configure Descriptive Flexfields (DFFs)

A flexfield is a set of placeholder fields associated with a business object. Oracle Cloud ERP provides three types of flexfields: descriptive, extensible, and key. This section describes how to support a descriptive flexfield (DFF), such as Attribute1, for a particular table in Incorta:

  • In Oracle Cloud ERP, there is a separate VO for a table’s flexfields. For example, for the ContractHeaderP VO, the flex VO is called FscmTopModelAM.ContractsCoreAM.HdrDFBIAM.FLEX_BI_ContractHeaderFlexfield_V. First, find out the name of the flexfield VO.

  • See the section ‘Adding new Fusion ERP table in Incorta’ for how to import this VO into Incorta. Note the key fields; they will be named like S_K_5000.

  • Join this VO as a parent table to the base VO. Use the key columns of the base VO to join to the S_K_5000 columns in the flex VO. ContractHeader

  • Add the DESC_ field from the flex VO into a business schema and add the appropriate label. For example, in ContractHeader the end-user DFF column name may be called ProjectMarket and the flexfield value field in the VO would be called DESCPROJECTMARKET, so we can add the label ‘Project Market’.

  • Load the VO

    • Note: We have included some DFF VOs in the Fusion_FLX_Common schema. Since flex VOs are deployment specific, you need to extract the existing flex VO in BICC, download the mdcsv file from BICC, upload it to Incorta (as per the instructions in the ‘Adding new Fusion ERP table in Incorta’ section), and save the flex VO in Incorta to get the updated fields.
  • If no fields are selected in the flex VO, edit the VO in BICC, select all fields, and save the VO.

  • If you don’t see some flex VOs in BICC, the customer needs to deploy the DFFs from the Fusion application.

  • For flexfields to be available in BI, you need to do the following in the Fusion app:
    • Go to the Manage Descriptive Flexfields task.
    • Query the DFF code.
    • Go to edit and find the needed attributes.
    • Edit the attribute; you will find a "BI Enabled" check box.
    • The customer needs to enable this check box and save it.
    • Deploy that flex code.
    • Run the "Import Oracle Fusion Data Extensions for Transactional Business Intelligence" ESS job.
    • Once the above is processed, the flexfields should be available for extract in BICC.
    • Refer to steps 1-3 of this Oracle blog.
    • For additional information on finding a descriptive flexfield (DFF) name, go here.

Seeded flexfield VOs in Fusion_FLX_COMMON schema
DESCRIPTIVE_FLEXFIELD_NAME | Flex VO Name | Base VO (will be child) | APPLICATION_TABLE_NAME
- | FscmTopModelAM.CustAcctInformationBIAM.FLEX_BI_CustAcctInformation_VI | CustAcctInformationVO | HZ_CUST_ACCOUNTS
CE_EXTERNAL_TRANSACTIONS | FscmTopModelAM.ExternalTransactionBIAM.FLEX_BI_ExternalTransaction_VI | FscmTopModelAM.FinCeCashTransactionsAM.ExternalTransactionsPVO | CE_EXTERNAL_TRANSACTIONS
ContractHeaderFlexfield | FscmTopModelAM.ContractsCoreAM.HdrDFBIAM.FLEX_BI_ContractHeaderFlexfield_VI | FscmTopModelAM.ContractsCoreAM.ContractHeaderP | OKC_K_HEADERS_ALL_B
ContractLineFlexfield | fscmtopmodelam.contractscoream.lndfbiam_flex_bi_contractlinedescflexfield_vi | FscmTopModelAM.ContractsCoreAM.ContractHeaderLinesP | OKC_K_LINES_ALL_B
EGP_SYSTEM_ITEMS_DFF | FscmTopModelAM.EgpItemsPublicModelAM.ItemBaseDFFBIAM.FLEX_BI_ItemBaseDFF_VI | FscmTopModelAM.EgpItemsPublicModelAM.Item | EGP_SYSTEM_ITEMS_B
EGO_ITEM_EFF_B | FscmTopModelAM.EgpItemsPublicModelAM.ItemEFFBIFlattenedVO | - | EGP_SYSTEM_ITEMS_B
INV_MATERIAL_TXNS | FscmTopModelAM.InxTxnDFFBIAM.FLEX_BI_InxTxnDFF_VI | FscmTopModelAM.InventoryAM.InventoryTransactionDetailPVO | INV_MATERIAL_TXNS
PJC_EXP_ITEMS_DESC_FLEX | FscmTopModelAM.PjcEiBIAM.FLEX_BI_PjcEi_VI | FscmTopModelAM.PjctransactionstAM.expenditureitempvo | PJC_EXP_ITEMS_ALL
PJC_TXN_XFACE_DESC_FLEX | FscmTopModelAM..PjcUnProcDFFBIAM.FLEX_BI_Pjc_UnprocDFF_VI | FscmTopModelAM.PjcTransactionsAM.ProjectUnprocessedCostTransactionPVO | PJC_TXN_XFACE_ALL
PJF_PROJECTS_DESC_FLEX | FscmTopModelAM.PJF_PROJECTS_ALLBIAM.FLEX_BI_PJF_PROJECTS_ALL_VI | FscmTopModelAM.PjfProjectAM.Project | PJF_PROJECTS_ALL_B
PJF_TASK_STRUCTURE_DFF | FscmTopModelAM.TaskDffBIAM.FLEX_BI_TaskDff_VI | FscmTopModelAM.PjfProjectDefinitionAM.TaskVLPVO | PJF_PROJ_ELEMENTS_B
POR_REQUISITION_HEADERS | FscmTopModelAM.RequisitionHeaderPublicBIAM.FLEX_BI_RequisitionHeaderPublic_VI | FscmTopModelAM.PrcPorPublicViewAM.RequisitionHeaderP1 | POR_REQUISITION_HEADERS_ALL
POR_REQUISITION_LINES | FscmTopModelAM.RequisitionLinePublicBIAM.FLEX_BI_RequisitionLinePublic_VI | FscmTopModelAM.PrcPorPublicViewAM.RequisitionLineP1 | POR_REQUISITION_LINES_ALL
POZ_SUPPLIER_SITES_ALL_M | FscmTopModelAM.PozSupplierSitesDFFPublicBIAM.FLEX_BI_PozSupplierSitesDFFPublic_VI | FscmTopModelAM.PrcPozPublicViewAM.SupplierSitePVO | POZ_SUPPLIER_SITES_ALL
PO_HEADERS | FscmTopModelAM.POHeaderDFFPublicBIAM.FLEX_BI_POHeaderDFFPublic_VI | FscmTopModelAM.PrcPoPublicViewAM.StandardHeaderPVO | PO_HEADERS_ALL
PO_LINES | FscmTopModelAM.POLineDFFPublicBIAM.FLEX_BI_POLineDFFPublic_VI | FscmTopModelAM.PrcPoPublicViewAM.StandardLinePVO | PO_LINES_ALL
RA_CUSTOMER_TRX | FscmTopModelAM.TransactionHeaderFLEXBIAM.FLEX_BI_TransactionHeaderFLEX_VI | FscmTopModelAM.FinArTopPublicModelAM.TransactionHeaderPVO | RA_CUSTOMER_TRX_ALL
RCV_TRANSACTIONS | FscmTopModelAM.RcvTxnDFBIAM.FLEX_BI_RcvTxnDF_VI | FscmTopModelAM.RcvReceiptsAM.ReceivingReceiptTransactionPVO | RCV_TRANSACTIONS
- | FscmTopModelAM.AssetCategoryDffBIAM.FLEX_BI_AssetCategoryDff_VI | FscmTopModelAM.FinFaTrackDescDetailsAM.AdditionBasePVO | FA_ADDITIONS_B
- | FscmTopModelAM.DeliveryDFFBIAM.Flex_BI_DeliverDFF-VI | - | WSH_NEW_DELIVERIES
DOO_HEADERS_EFF_B | FscmTopModelAM.DooTopAM.HeaderBIEffEFFBIFlattened | FscmTopModelAM.ScmExtractAM.DooBiccExtractAM.HeaderExtractPVO | -
DOO_LINES_EFF_B | FscmTopModelAM.DooTopAM.FulfillLineBIEffEFFBIFlattened | FscmTopModelAM.ScmExtractAM.DooBiccExtractAM.LineExtractPVO | -

Supporting Drill to Cloud ERP Page

Incorta can support drilling to a Cloud ERP page provided that a URL exists to which we can pass the ID. Here are some sample URLs that can be configured in the URL field of an insight; only the host name needs to be modified.

Purchase Requisitions

Purchase Requisitions: https://<host>/fscmUI/faces/FndOverview?fnd=%2FWEB-INF%2Foracle%2Fapps%2Fprc%2Fpor%2FmanageReq%2Fui%2Fflow%2FManageReqMainAreaFlow.xml%23ManageReqMainAreaFlow;paramInvokedApp%3DNOTIFICATION%3BparamReqHeaderId%{EBS_PO.PO_REQUISITION_HEADERS_ALL.REQUISITION_HEADER_ID};;;false;256;;;&fndGlobalItemNodeId=itemNode_my_information_purchase_requisition

AP Invoices

https://fa-engu-test-saasfaprod1.fa.ocs.oraclecloud.com/fscmUI/faces/deeplink?objType=AP_VIEWINVOICE&action=VIEW&objKey=InvoiceId={FusionPayablesTransactions.InvoiceHeaders.INVOICE_ID}

Please refer to the following Oracle documentation for further support.

Sizing

To determine data sizing for Oracle Cloud ERP, ask the customer to create a BI Publisher report using the SQL below and send the output to the Incorta Applications team (applications@incorta.com):

select owner, table_name, num_rows, round(blocks*8/1024) size_mb from all_tables where owner not in ('SYS','SYSTEM') order by 1, 2

Step 7: Adding new fields to existing tables in Incorta

  • To integrate new fields into existing tables in Incorta, go to BICC and, under Manage Jobs on the right, open the job that has the VO you are trying to update. BICAddTable
  • Click on the data store and then click Select Columns at the top right. BICCAddTablePt2

Then select the columns you want, and hit Save and Done.

  • In Incorta, go into the schema where the table is, click on the dataset, select Validate, and then verify that the column has been added.
  • Select Done, save the changes, and perform a full load of the table.
  • Verify the values in the new field using Explore Data.

Step 8: Adding new VOs in Incorta Jobs

Users will have situations where they want to bring in additional data from Cloud ERP systems. The main method of bringing Cloud ERP data into Incorta is through View Objects (VOs).

The high-level steps are discussed below:

  • Identify the VO that has the data that is needed for reporting in Incorta
  • Refer to the Extract VOs

Create the table corresponding to the new VO being added in Incorta

  • Add the VO to an offering
  • Then add the VO to an existing job or a new job in BICC
  • Log in to Incorta. If required, add the job name and job ID in the Fusion data connection.
  • Use the schema wizard to Import the VO definitions to create an Incorta table, set up the job name, and select Incorta as the value for the ‘Start BICC jobs Using’ option, as shown below. EditDataset
  • Create the Joins from child to parent. Create any formula columns if needed.
  • Load the physical schema in Incorta
  • Create or update a business schema
  • Create new dashboards on top of that business schema and validate the data and joins.

Find the VO in BICC. If VO is not found, add a new VO to BICC

Note that not all VOs are pre-seeded.

  • Log into Oracle BICC and search for those VOs. You can also add the VOs to a custom offering and extract just that offering. VOinBICC
  • If the VO is not present, you can add it by going to Manage Offerings and Data Stores at the top right-hand corner, as shown in the steps and screenshots below. VOCloudConnector

Select the offering and click the + button to add the new VO to the offering. VOOffering Then specify the VO to be added: VOAdd

Select Save.

Select columns to be extracted in the VO you just added: VOColumns

Select the desired columns, then select Save.

Data Security

To implement data security by org for Fusion, review the suggestion below.

  • Check the two OTBI subject areas below; these subject areas have all the user and role info. More information is available in this Oracle document link.
    • Security - Audit Real Time
    • Security - Roles and Privileges Real Time
    • Some VOs for these are:
      • HcmTopModelAnalyticsGlobalAM.AseAnalyticsAM.FndDataSecurityPVO
      • HcmTopModelAnalyticsGlobalAM.AseAnalyticsAM.OpssRolePVO
  • Define an Excel file that lists each user and the business units (BUs) they have access to.
    • For example, where userId is the Incorta user ID:
      • userId, org_id
      • U1, org1
      • U1, org2
      • U2, org1
  • Upload the file to Incorta, create a table from it, and load it.
  • Create an internal session variable called GetOrgId from the Schema tab with the following expression:
    • query(<schema>.security_tbl.org_id,<schema>.security_tbl.userId=$user)
  • Add a security filter to the base transaction table you want to secure, based on this variable. The condition below returns all records if the user is an administrator (see the SQL sketch after this list): or($user='demo', inlist(<Fusion_schema.transaction_table>.org_id, $GetOrgId))
    • Keep in mind that the base table in the query plan is the driving table for the security, so the filter has to go there. Refer to this article on how to secure the data.
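
For illustration only, the security filter above behaves roughly like the following SQL; the schema, table, and column names are placeholders, and 'U1' stands in for the signed-in Incorta user ($user):

-- Restrict the base transaction table to the org_ids mapped to the user, with an administrator bypass
SELECT t.*
FROM Fusion_AP.transaction_table t              -- placeholder for the secured base table
WHERE 'U1' = 'demo'                             -- administrator bypass ($user = 'demo')
   OR t.org_id IN (
        SELECT s.org_id
        FROM security_schema.security_tbl s     -- the uploaded user-to-BU mapping table
        WHERE s.userId = 'U1');                 -- the signed-in user ($user)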

Oracle OCI Object Storage

You can access BICC content in Object Storage.

  • Log in to your Oracle Cloud account.
  • Navigate to the Object Storage tab.
  • View all the existing buckets associated with your region. OracleSecurity
  • Click Profile on the top right-hand side and select your profile. OracleSecurityProfile
  • Click API Keys on the bottom left-hand side. APIKeys
  • Add a new API key and download the private and public key files. APIKeysDownload
  • Click Add and copy the configuration file details. ConfigFile
  • Create the configuration file

    • Make sure the downloaded private and public key files are in the same folder, saved within the Incorta installation directory (e.g., /IncortaAnalytics).
    • Start a terminal session in the same directory as the keys.
    • Type `code <config_file_name>`; it will open a text file.
    • Paste the configuration file preview code and add the path of the private key file (see the sample configuration file sketch after this section).

    OCIConfig

  • Now configure the Fusion data connection with the details of Object storage

  • Object Storage:

    https://www.ateam-oracle.com/set-up-oracle-fusion-saas-business-intelligence-cloud-connector-bicc-to-use-oracle-cloud-infrastructure-oci-object-storage

    • For the Object Storage:
      • OCI configuration file path: the path to the configuration file for Oracle Object Storage, containing information about the user, profile, tenancy ID, and region.
      • OCI configuration profile: the profile you wish to access your data from. The configuration file can contain multiple profiles, each with its own authentication credentials, so specify which profile to work with.
      • Bucket: the name of the bucket that contains the target data files. (A bucket is a logical container for storing objects, comparable to a folder in a file system.)
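
Referring back to the 'Create the configuration file' step above, here is a minimal sketch of what an OCI configuration file typically looks like; every value below is a placeholder, and key_file should point to the private key file you downloaded:

[DEFAULT]
user=ocid1.user.oc1..<user_ocid>
fingerprint=<api_key_fingerprint>
tenancy=ocid1.tenancy.oc1..<tenancy_ocid>
region=<your_region>
key_file=/IncortaAnalytics/<private_key_file>.pem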

FAQs

What should I do if my BICC extract fails to load? Add a filter in the VO or consider using the chunking option.

How can I always do full load for a VO?

  • In BICC, edit the VO and uncheck the option shown below for the VO extract. It will then always be a full load.
    • Disable the Incremental option in the Incorta table’s dataset tab and choose the ‘Load last batch’ option. FullLOadVO

How do I handle source deletion of rows?

  • Create a new job in BICC for the particular VO.
    • Create a new connection, set the file type to PECSV, and add the job name and ID pair.
    • Create a new Incorta schema for the PK table and add the PK table using the schema wizard.
    • Always do a full load of the above PK table. Make sure that the Incremental toggle is turned off for this table.
    • Make the above table a parent of the base table by creating a child join from the base table to this PK table.
    • Apply a table security filter on the base table such as "a field of the parent table is not null".
      • Sample security filter on the base table using the parent table field: not(isNull(Test.parent_table_with_keys.id))
    • Refer to this article for additional information.
    • For Google BigQuery, create a routine in BigQuery to delete these rows from the CDC table. Open a query window and run the CREATE statement below:
CREATE OR REPLACE PROCEDURE `incorta-se.Raw_Fusion_Example_Delete.APInvoiceHeaderDelete`()
BEGIN
  -- Remove rows from the CDC table whose keys no longer exist in the PK (primary key) extract
  DELETE FROM Raw_Fusion_Example_Delete.APInvoiceHeader
  WHERE InvoiceID IN (
    SELECT APIH.INVOICEID
    FROM Raw_Fusion_Example_Delete.APInvoiceHeader APIH
    LEFT OUTER JOIN Raw_Fusion_Example_Delete.pk_invoiceheaderpvo APIHD
      ON APIH.INVOICEID = APIHD.INVOICEID
    WHERE APIHD.INVOICEID IS NULL);
END;
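
Once created, the procedure can be run after each load (or scheduled as a BigQuery scheduled query); a minimal sketch, assuming the default project is the one that holds the dataset:

-- Purge deleted rows from the CDC table
CALL Raw_Fusion_Example_Delete.APInvoiceHeaderDelete();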

For On-Prem, what does this error mean: java.io.IOException: Cannot run program "chmod": error=24, Too many open files?

This issue happens because the connector build now contains multiple jars, so at startup the server tries to load all the available connectors, which requires opening all the connector jars; this hits the OS limit on the number of open files allowed per process.

  • On the server, run this command and check the output: ulimit -a
  • Check the open files parameter. If it is 1024, increase the number of open files: edit the file /etc/security/limits.conf and add the following lines:
    • * soft nproc 65535
    • * hard nproc 65535
    • * soft nofile 65535
    • * hard nofile 65535
  • Restart both services.

How do I ingest an Oracle BIP report output into Incorta?

Use the Fusion V1 connector. Refer to this article for more information.

How do I delete a zip file in UCM?

Log in to UCM, search for the file, and click the ‘i’ icon. Then click the Delete button.

Does Incorta support incremental extracts for Oracle Cloud ERP? Yes.

How do you keep the VOs updated based on the Oracle releases?

Oracle can add new columns to the VOs, so if needed, just open and save the table in Incorta to pick up the new columns.

How should I size the Incorta server?

There is a DB sizing script for an Oracle database, but many Fusion customers do not have a replicated Oracle Fusion DB because it is very expensive. If the customer only has access to BICC, they need to ask Oracle to run the script in the Fusion DB and send the output.