Connectors → SAP ERP
About SAP ERP
SAP ERP is an enterprise resource planning system built for businesses of all sizes, with industry-specific modules ranging from manufacturing to retail. The software supports and integrates almost every functional area of a business process, from procurement of goods and services, sales and distribution, finance, accounting, human resources, manufacturing, and production planning to logistics and warehouse management.
About the SAP ERP Connector
The SAP ERP Connector loads data into Incorta from SAP ERP tables, views, Advanced Business Application Programming (ABAP) Core Data Services (CDS) views, and HANA CDS views.
The connector is built using the SAP Java Connector (JCo) library and uses a Remote Function Call (RFC) enabled Function Module program to perform the following activities (a call sketch follows the list):
- Receive Requests
- Generate Schema Definition
- Extract Data from Tables
- Log Extraction and Processing Status
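The connector performs these steps internally. For illustration only, the following is a minimal JCo sketch of that request pattern: resolve a configured destination, look up the RFC-enabled Function Module in the remote repository, and execute a request. The destination name INCORTA_SAP_ERP is hypothetical; the Function Module and parameter names are the ones referenced in the verification section later in this document.

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;

public class RfcCallSketch {
    public static void main(String[] args) throws JCoException {
        // Resolve a configured destination, for example from an
        // INCORTA_SAP_ERP.jcoDestination properties file in the working directory.
        JCoDestination destination = JCoDestinationManager.getDestination("INCORTA_SAP_ERP");

        // Verify connectivity before issuing requests.
        destination.ping();

        // Look up the RFC-enabled function module in the remote repository.
        JCoFunction function = destination.getRepository().getFunction("ZINC_SAP_ERP_DATA_EXTRACTOR");
        if (function == null) {
            throw new IllegalStateException("Function module not found in the SAP repository");
        }

        // The request type drives the activity performed by the function module
        // (schema discovery, data extraction, and so on).
        function.getImportParameterList().setValue("IN_REQUEST_TYPE", "GET_SCHEMA_LIST");
        function.execute(destination);
    }
}
```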
The following are the SAP objects used to build the SAP ERP Connector function module:
Object Type | Object Name |
---|---|
Class | CL_ABAP_CHAR_UTILITIES |
Class | CL_ABAP_GZIP |
Class | CL_ABAP_MEMORY_UTILITIES |
Class | CL_ABAP_STRUCTDESCR |
Class | CL_ABAP_TABLEDESCR |
Class | CL_ALV_TABLE_CREATE |
Class | CX_ROOT |
Class | CX_SY_DYNAMIC_OSQL_SEMANTICS |
Class | CX_SY_DYNAMIC_OSQL_SYNTAX |
Function Module | BALW_BAPIRETURN_GET1 |
Function Module | CCU_TIMESTAMP_DIFFERENCE |
Function Module | RZL_READ_DIR_LOCAL |
Function Module | DDIF_FIELDINFO_GET |
Function Module | SUSR_USER_CHECK_EXISTENCE |
Function Module | SYSTEM_CALLSTACK |
Function Module | RFC_GET_ATTRIBUTES |
Structure | CNVCMIS_A_20_DATA_0064 |
Structure | CNVCMIS_A_20_DATA_0128 |
Structure | CNVCMIS_A_20_DATA_0256 |
Structure | CNVCMIS_A_20_DATA_0512 |
Structure | CNVCMIS_A_20_DATA_1024 |
Structure | CNVCMIS_A_20_DATA_2048 |
Structure | CNVCMIS_A_20_DATA_3072 |
Structure | CNV_10500_DATA_4096 |
Structure | CNV_10500_DATA_8000 |
Structure | EPSF |
Table Types | SYS_CALLST |
Table Types | TR_TABNAMES |
The SAP ERP connector uses configuration parameters to manage table access control. Table access control can be enforced either with the User Profile Authorization Groups method or with a list of table names maintained in a Z table.
SAP ERP Connector Compatibility
The SAP ERP connector has been developed and tested for the following SAP ERP Systems:
- SAP ECC 6.0 EHP 7 - 702, 731, 740, 752
- SAP S/4 HANA - 1503, 1511, 1605, 1610, 1709, 1809, 1909
The SAP ERP connector supports the following Incorta specific functionality:
Feature | Supported |
---|---|
Chunking | ✔ |
Data Agent | |
Encryption at Ingest | |
Incremental Load | ✔ |
Multi-Source | ✔ |
OAuth | |
Performance Optimized | ✔ |
Remote | |
Single-Source | ✔ |
Spark Extraction | |
Webhook Callbacks | ✔ |
Deployment of the SAP ERP Connector in SAP
This section provides step-by-step instructions for importing the binary transports containing Incorta’s development objects into the SAP instance. It is intended for SAP BASIS administrators or Incorta schema administrators with the appropriate SAP transaction and SAP transport management access rights.
Installation User Profile
The following authorization objects are required for the SAP user to complete the import steps:
SAP Transaction Code | SAP Transaction Description | Authorization Object |
---|---|---|
CG3Z | File Upload | S_DATASET |
STMS | Transport Management System | S_TRANSPRT, S_CTS_ADMI |
SE10 | Transport Organizer | S_TRANSPRT, S_DATASET |
STMS_IMPORT | Transport Request Import | S_TRANSPRT, S_CTS_ADMI |
AL11 | Display SAP Directories | S_ADMI_FCD |
FILE | Cross-Client File Names/Paths | |
SE16N | General Table Display | |
SM30 | Maintain Table View | |
SE11 | ABAP Dictionary Maintenance | S_DEVELOP |
SU01 | User Maintenance | S_USER_GRP |
SU53 | Check User Authorization | |
Additional | User Interface | S_GUI |
SM69 | Operating System Commands | |
Extract User Profile
The extract user, INC_RFC, will be created as a Communications Data user type. The following authorization objects are required for the SAP user to complete the data extraction from Incorta:
Authorization Object | Authorization Field | Authorization Value |
---|---|---|
S_RFC | RFC_TYPE | FUGR, FUNC |
S_RFC | RFC_NAME | RFCPING, SYST, ZINC_INTF_REL, ZINC_SAP_ERP_DATA_EXTRACTOR, RFC1, RFC_GET_FUNCTION_INTERFACE, DDIF_FIELDINFO_GET, SDIFRUNTIME |
S_RFC | ACTVT | 16 - Execute |
S_TCODE | TCD | SE16, ZSE16 |
S_BTCH_ADM | BTCADMIN | A - Optional Check for Class A Jobs; B - Optional Check for Class B Jobs; C - Copy Another User’s Jobs; D - Authorization to Display Jobs from Other Clients; N - No Administrator Authorization; P - Authorization to Create Periodic Jobs; Y - Background Administrator Authorization |
S_BTCH_JOB | JOBACTION | RELE |
S_BTCH_JOB | JOBGROUP | * |
S_DATASET | PROGRAM | SAPLZINC_INTF_REL |
S_DATASET | ACTVT | 06, 33, 34 |
S_DATASET | FILENAME | * |
S_LOG_COM | COMMAND | ZINC_CHMOD, ZINC_CHOWN, ZINC_COMPRESS, ZINC_TRANS_CF, ZINC_TRANS_SCP, ZINC_WINSCP_CMD |
S_LOG_COM | OPSYSTEM | Linux, NT, Unix, Windows NT |
S_LOG_COM | HOST | * |
S_TABU_DIS | DICBERCLS | <Table_Authorization_Group> |
S_TABU_DIS | ACTVT | 03 - Display |
S_TABU_NAM | ACTVT | 03 - Display |
S_TABU_NAM | TABLE | <Table_Name> |
S_TABU_CLI | CLIIDMAINT | X – If the same user is used for table maintenance. |
- <Table_Name> represents a complete list of table names. Wild card characters are not supported.
- <Table_Authorization_Group> represents a complete list of authorization group names. Wild card characters are not supported.
SAP Function Module Installation Steps
The steps to install the SAP function module are as follows, with details for each step in the sections below:
- Copy the transport (CO and DATA) files and add the request to the SAP application server
- Import the transport request
- Configure the Function Module
- Set up and configure the Data Extract User
- Verify the Function Module setup
Set Up the SAP Function Module Transport Request
Download the Installation Files
Contact Incorta Support to obtain the SAP ERP Connector binary transport ZIP file. The ZIP file will contain two files:
- CO file: The filename is in the format KXXXXXX.YYY
- DATA file: The filename is in the format RXXXXXX.YYY
XXXXXX is a six-digit number and YYY is the source system ID.
Upload the Transport Files
Upload the CO and DATA transport files to the target application server.
Upload the CO File
Execute the SAP Transaction Code CG3Z in the SAP GUI to upload the CO file to the default CO transport request directory: /usr/sap/trans/cofiles
The Upload File: Parameters dialog requires the following information:
- Source file on front end (e.g. /Incorta/Release1/Transport_Requests/K900012.ERP)
- Target file on the application server (e.g. /usr/sap/trans/cofiles/K900012.ERP)
Upload the DATA File
Execute the SAP Transaction Code CG3Z in the SAP GUI to upload the DATA file to the default DATA transport request directory: /usr/sap/trans/data
The Upload File: Parameters dialog requires the following information:
- Source file on front end (e.g. /Incorta/Release1/Transport_Requests/R900012.ERP)
- Target file on the application server (e.g. /usr/sap/trans/data/R900012.ERP)
Import the Transport Request
Import the transport request with the following steps:
- Execute the SAP Transaction Code STMS_IMPORT in the SAP GUI to import the transport request for the uploaded files.
- Select the Extras → Other Requests → Add menu option to add a new transport request to the import queue.
- In the Add Transport Request to Import Queue dialog:
- Enter the transport request (e.g. ERPK900012) or select the lookup button to view the available transport request codes.
- Select the green checkbox.
- In the Add Transport Request confirmation dialog, select Yes.
- Once the transport request is added, it is shown in the Import Queue. To import the transport request, select the transport request in the queue, and select Import Request (truck icon).
- In the Import Transport Request dialog
- Enter the Target Client (e.g. 100).
- Select Options, and select the checkbox for each Import Option.
- Select the green checkbox.
- In the Start Import dialog, select Yes.
- Once the transport is imported, the Import Queue Short Text for the request will be populated (e.g. Incorta SAP ERP Extraction connector RFC - RELEASE 1). Select the request to view the Import Log File. All objects should have been created with a valid return code.
Configure the SAP Function Module
Connection Properties
Connection properties define the default settings or appropriate values for the Function Module(s). They are stored as key-value pairs, per Function Module, in the ZINC_CONFIG_PROP table, which has three columns: Function Module, Property Name, and Property Value. Execute the SAP Transaction Code SM30 to view or change the connection properties.
The connection properties and their values are development objects. They should be changed only if needed during development or initial integration.
The following are the ZINC_CONFIG_PROP connection properties and the respective default values:
Property Name | Property Description | Property Value |
---|---|---|
ZINC_CP_ASYNC_FILE_CLEAN_UP_METHOD | Enter the file clean up method. | IMMEDIATE |
ZINC_CP_ASYNC_FILE_TRAN_METHOD | Enter the method used when extracting the data. The default value is ASYNC_FILE_UPLOAD_TO_INCORTA_SERVER_SCP. | ASYNC_FILE_UPLOAD_TO_INCORTA_SERVER_SCP |
ZINC_CP_AUTO_CLEANUP_FLAG | Enable or disable the auto file clean up process. Possible values are Y and N. | Y |
ZINC_CP_BUILD | Enter the date of the last build. | 20201020 |
ZINC_CP_CHARACTER_ENCODING | Enter the character encoding format used for data extraction. The default is utf-16le. | utf-16le |
ZINC_CP_CLEANUP_WINDOW_HOURS | Enter how often the cleanup program should run. The default is 24. | 24 |
ZINC_CP_COMPRESSION_ENABLED | Enable or disable compression. The default value is N. | Y |
ZINC_CP_COMPRESSION_METHOD | Enter the compression method to be used for extracted data. The default is GZIP. | GZIP |
ZINC_CP_COMPRESS_ENCODING | Enter the character encoding format used for data compression. The default is iso-8859-1. | iso-8859-1 |
ZINC_CP_DATE_FORMAT | Enter how the date should be represented in the extraction. The default is YYYYMMDD. | YYYYMMDD |
ZINC_CP_DATE_FORMAT_BW | Enter how the date should be represented in the Business Warehouse (BW) extraction. The default is YYYY.MM.DD. | YYYY.MM.DD |
ZINC_CP_DELETE_LOG | Enable or disable the auto deletion process for log entries. Possible values are Y and N. | Y |
ZINC_CP_DELIMITER_CHARACTERS | Enter the character or characters used for delimiting the columns. The default value is ,. | , |
ZINC_CP_DEL_LOG_PERIOD | Enter the number of days you wish to keep the log entries in the log tables for debugging and/or test purposes. The default is 90. | 90 |
ZINC_CP_ENABLE_PARALLEL_PROCESSING* | Enable or disable parallel processing. The default value is Y. | N |
ZINC_CP_ENABLE_TIMESTAMP_CHUNKING | Enable or disable chunking using the timestamp field. The default value is Y. | Y |
ZINC_CP_ENCLOSING_CHARACTERS | Enter the character or characters used for enclosing the column value when the delimiter is found in the column value. The default is ". | " |
ZINC_CP_ENCRYPTION_ENABLED | Enable or disable encryption. | |
ZINC_CP_ENCRYPTION_METHOD | Enter the encryption method. | |
ZINC_CP_ESCAPE_CHARACTERS | Enter the character or characters used to escape the delimiter occurrence in the column value. The default is ". | " |
ZINC_CP_EXTRACTION_METHOD | Enter how the data extraction and transfer will be done. The default is SYNCHRONOUS. | SYNCHRONOUS |
ZINC_CP_EXTRACTION_SELECTION | Enter what types of extractions are allowed in the connector. The default is TABLE. | TABLE |
ZINC_CP_FILE_LAST_CLEANUP_TIMESTAMP | Store the timestamp when the last cleanup occurred for the mirror files. The function module will update the timestamp when the file cleanup program has successfully completed. | 20200406161645.3327320 |
ZINC_CP_FILE_LOCATION | Enter where the extraction file will be stored. The file location is either a logical file path name or actual physical directory path. The default value is the logical file path name created in FILE t-code and the value is ZINC_INCORTA. | ZINC_INCORTA |
ZINC_JOB_SERVER_GRP | Enter the background job server group allocation. The default is blank. | ZINC_JOBSG |
ZINC_CP_LOG_SELECTED_COLUMNS | | N |
ZINC_CP_MIRROR_FILE_ENABLED | Enable file generation for the data extraction. Possible values are N and Y. The default is N. | N |
ZINC_CP_MIRROR_FILE_STORE_HOURS | Enter how many hours the file can be stored on the application server. The default is 8. | 24 |
ZINC_CP_PARALLELPROCESSING_SERVERGROUP* | Enter the server group for parallel processes. | INCORTA |
ZINC_CP_RELEASE | Enter the current version of the connector. | 4.4 |
ZINC_CP_REQUEST_TYPE | Enter what types of requests are allowed in the connector. Possible values are: GET_DATA, GET_SCHEMA, GET_SCHEMA_LIST, GET_TABLE_LIST, GET_COLUMN_LIST, SUBMIT_REQUEST, EXECUTE_REQUEST, SUBMIT_REQUEST_ORIG, EXECUTE_REQUEST_ORIG. The default is GET_DATA. | GET_DATA |
ZINC_CP_ROW_DELIMITER_TYPECODE | Enter the character or characters used for delimiting the rows. The default is CR_LF. | CR_LF |
ZINC_CP_SCHEMA_DISCOVERY_METHOD | Enter what type of table access restriction is allowed for the connection. | ZINC_AUTH_GROUPS |
ZINC_CP_SCHEMA_TABLE_UPDATES | Update the schema definition log tables with the latest table definitions and when they are executed. The default value is Y. | Y |
ZINC_CP_SELECT_PACKAGE_SIZE | Enter the package size that is used when extracting the data. The default is 15000. | 50000 |
ZINC_CP_STXL_LAST_LOAD_DATE | | 00000000 |
ZINC_CP_TIME_FORMAT | Enter how the time should be represented in the extraction. The default is HHMMSS. | HHMMSS |
ZINC_CP_TRANSPORT_ENTRIES | Enter the request number. | S4HK900246 |
ZINC_CP_UNCOMPRESSED_DATA_PACKET_SIZE_MB | Enter the package size in MB used when extracting the data in Uncompressed Mode. The default value is 100. | 100 |
*These properties are available only for SAP S/4 HANA. The rest are available for all SAP versions.
Update the Schema Discovery Method
The following steps can be used to update the Schema Discovery Method:
- Execute the SAP Transaction Code SM30 in the SAP GUI.
- Enter the table name ZINC_CONFIG_PROP.
- Select Display.
- Select Edit (pen and glasses icon) to edit the table contents.
- Update the ZINC_CP_SCHEMA_DISCOVERY_METHOD property value.
- Select Save (floppy disk icon).
Set Up and Configure the SAP Data Extraction User
Create the Data Extraction User
Execute SAP Transaction Code SU01 to create the INC_DE_USER user, which is used to access the SAP system for the data extraction.
In the User Maintenance: Initial Screen dialog, enter the user ID, INC_DE_USER, in User.
Select create (paper icon).
Enter the following Person properties for the Incorta Data Extract User:
- Last name: INCORTA
- First name: DATA EXTRACT USER
Select Logon Data and select a User Type of Communications Data.
Note: If the same user will be used to perform table maintenance or log in to the SAP GUI, then select a User Type of Dialog.
Select the Defaults tab to change the defaults as necessary.
Select save (floppy disk icon).
Create the User Role and Assignment
Configure the Authorization Objects, Fields and Values for the INC_DE_USER.
Data Extraction Folder
Configure the Physical Folder
To configure the physical folder, perform the following steps:
- Execute the SAP Transaction Code FILE.
- Search for the logical file path: ZINC_INCORTA
- Select the Assignment of Physical Paths to Logical Path option.
- For the Physical Paths, choose the type of operating system (OS) used for the application server.
- Double click the OS Name.
- Enter the physical path to the folder (e.g. tmp/<FILENAME>).
- The physical path can be changed as desired.
- At least 100 GB of free space is needed to allow for auto growth during a full load.
- The remove file feature cleans up the folder automatically.
Set Up the Logical Folder Content Display
To set up the logical folder content display, perform the following steps:
- Execute the SAP Transaction Code AL11.
- Select the Configure User Directories icon to create a new mapping.
- In the User-Defined Directories dialog, enter the following information:
- Name of Directory Parameter: DIR_INCORTA
- Directory Name:
/usr/sap/INCORTA_SAPDATAOUTPUT
- Valid for Server Name: all
- Select Save (floppy disk icon).
Verify the SAP Function Module Setup
To verify the SAP Function Module, perform the following steps:
- Execute the SAP Transaction Code SE37.
- In the Function Builder: Initial Screen dialog, enter the name of the main function module, ZINC_SAP_ERP_DATA_EXTRACTOR.
- Select the Test/Execute icon.
Get Schema Request
- In the Test Function Module: Initial Screen dialog, enter the following import parameter:
- IN_REQUEST_TYPE: GET_SCHEMA_LIST
- Select the Execute icon.
- The output for GET_SCHEMA_LIST is captured in the OUT_SCHEMA_DEFINITION table. Select the OUT_SCHEMA_DEFINITION entries to see the contents of the table.
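The same GET_SCHEMA_LIST request can be issued programmatically through JCo. The sketch below assumes a hypothetical destination named INCORTA_SAP_ERP and assumes OUT_SCHEMA_DEFINITION is exposed as a table parameter; the field layout of that table is not documented here.

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoTable;

public class GetSchemaListSketch {
    public static void main(String[] args) throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("INCORTA_SAP_ERP");
        JCoFunction fm = dest.getRepository().getFunction("ZINC_SAP_ERP_DATA_EXTRACTOR");

        // Same import parameter as in the SE37 test above.
        fm.getImportParameterList().setValue("IN_REQUEST_TYPE", "GET_SCHEMA_LIST");
        fm.execute(dest);

        // Read the OUT_SCHEMA_DEFINITION result set.
        JCoTable schemaList = fm.getTableParameterList().getTable("OUT_SCHEMA_DEFINITION");
        System.out.println("Schema entries returned: " + schemaList.getNumRows());
        for (int i = 0; i < schemaList.getNumRows(); i++) {
            schemaList.setRow(i);
            // Print the first field of each row as an example; the exact field
            // names are defined by the function module's table structure.
            System.out.println(schemaList.getString(0));
        }
    }
}
```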
Get Data – Master or Lookup Table
- In the Test Function Module: Initial Screen dialog, enter the following import parameters:
- IN_REQUEST_TYPE: GET_DATA
- IN_EXTRACTION_SELECTION: TABLE
- IN_SELECTED_TABLE_NAME: T024
- Select the Execute icon.
- The output for GET_DATA is captured in the output table specified in the OUT_DYNAMIC_TABLE_SELECTED output parameter. For the selected table, T024, the output table is OUT_DYNAMIC_TABLE_WIDTH_1024.
- Select the output table entries to see the contents of the extracted data.
Get Data – Transaction Table Stand Alone
- In the Test Function Module: Initial Screen dialog, enter the following import parameters:
- IN_REQUEST_TYPE: GET_DATA
- IN_EXTRACTION_SELECTION: TABLE
- IN_SELECTED_TABLE_NAME: BKPF
- IN_SELECTED_TAB_FULLINC_FILTER: BKPF~CPUDT >= '20150101' AND BKPF~CPUDT <= '20150131'
- Select the Execute icon.
- The output for GET_DATA is captured in the output table specified in the OUT_DYNAMIC_TABLE_SELECTED output parameter. For the selected table, BKPF, the output table is OUT_DYNAMIC_TABLE_WIDTH_2048.
- Select the output table entries to see the contents of the extracted data.
Get Data – Transaction Table with Parent Table
- In the Test Function Module: Initial Screen dialog, enter the following import parameters:
- IN_REQUEST_TYPE: GET_DATA
- IN_EXTRACTION_SELECTION: TABLE
- IN_SELECTED_TABLE_NAME: BSEG
- IN_PARENT_TABLE_NAME: BKPF
- IN_PARENT_TABLE_JOIN_CONDT: BKPF~BUKRS = BSEG~BUKRS AND BKPF~BELNR = BSEG~BELNR AND BKPF~GJAHR = BSEG~GJAHR
- IN_SELECTED_TAB_FULLINC_FILTER: BKPF~CPUDT >= '20150101' AND BKPF~CPUDT <= '20150131'
- Select the Execute icon.
- The output for GET_DATA is captured in the output table specified in the OUT_DYNAMIC_TABLE_SELECTED output parameter. For the selected table BSEG, the output table is OUT_DYNAMIC_TABLE_WIDTH_5000.
- Select the output table entries to see the contents of the extracted data.
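The same BSEG extraction, joined to its parent table BKPF, can be expressed through JCo with the import parameters shown above. In this sketch the destination name INCORTA_SAP_ERP is hypothetical, and OUT_DYNAMIC_TABLE_SELECTED is assumed to be an export parameter that names the width-specific output table.

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoParameterList;
import com.sap.conn.jco.JCoTable;

public class GetDataWithParentSketch {
    public static void main(String[] args) throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("INCORTA_SAP_ERP");
        JCoFunction fm = dest.getRepository().getFunction("ZINC_SAP_ERP_DATA_EXTRACTOR");

        // Mirror the SE37 test parameters for a transaction table with a parent table.
        JCoParameterList in = fm.getImportParameterList();
        in.setValue("IN_REQUEST_TYPE", "GET_DATA");
        in.setValue("IN_EXTRACTION_SELECTION", "TABLE");
        in.setValue("IN_SELECTED_TABLE_NAME", "BSEG");
        in.setValue("IN_PARENT_TABLE_NAME", "BKPF");
        in.setValue("IN_PARENT_TABLE_JOIN_CONDT",
                "BKPF~BUKRS = BSEG~BUKRS AND BKPF~BELNR = BSEG~BELNR AND BKPF~GJAHR = BSEG~GJAHR");
        in.setValue("IN_SELECTED_TAB_FULLINC_FILTER",
                "BKPF~CPUDT >= '20150101' AND BKPF~CPUDT <= '20150131'");
        fm.execute(dest);

        // OUT_DYNAMIC_TABLE_SELECTED names the output table that holds the extracted rows.
        String outputTable = fm.getExportParameterList().getString("OUT_DYNAMIC_TABLE_SELECTED");
        JCoTable rows = fm.getTableParameterList().getTable(outputTable);
        System.out.println(outputTable + " rows extracted: " + rows.getNumRows());
    }
}
```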
Steps to connect SAP ERP and Incorta
To connect SAP ERP and Incorta, here are the high-level steps, tools, and procedures:
- Create an external data source
- Create a schema with the Schema Wizard
- or, Create a schema with the Schema Designer
- Load the schema
- Explore the schema
Create an external data source
Here are the steps to create an external data source with the SAP ERP connector:
- Sign in to the Incorta Direct Data Platform.
- In the Navigation bar, select Data.
- In the Action bar, select + New → Add Data Source.
- In the Choose a Data Source dialog, in Other, select SAP ERP.
- In the New Data Source dialog, specify the applicable connector properties.
- To test, select Test Connection.
- Select Ok to save your changes.
SAP ERP connector properties
Here are the properties for the SAP ERP connector:
Property | Control | Description |
---|---|---|
Data Source Name | text box | Enter the name of the data source |
SAP Application Server Host | text box | Enter the SAP application server host |
SAP System Number | text box | Enter the SAP system number. The value is between 00 and 99. |
SAP Client | text box | Enter the SAP client number. This number is used to perform the RFC logon to SAP, binding the agent process with the account created during the transport import. |
User ID | text box | Enter the SAP user ID |
Password | text box | Enter the SAP password |
BAPI Name | text box | Optionally enter the Business Application Programming Interface (BAPI) name. The default is ZINC_SAP_ERP_DATA_EXTRACTOR. |
Peak Limit | text box | Optionally enter the maximum number of active connections that can be created for a destination simultaneously. The default is 10. |
Pool Capacity | text box | Optionally enter the maximum number of idle connections kept open by the destination. The default is 3. |
Metadata Cache Time in Minutes | text box | Optionally enter the metadata cache time in minutes. The default is 60. |
Extraction Method | drop down list | Select the extraction method. |
Process Data At Incorta | toggle | Enable this property to process data at Incorta |
Keep File For Debugging | toggle | Enable this property to save the file for debugging |
Enable Encryption | toggle | Enable this property for data encryption |
Asynchronous File Transfer Method | drop down list | Select the upload method. The options are: Upload to Incorta Server using SCP, Upload to Incorta Server using WinSCP, Upload to Incorta Shared Drive, Upload to Common Shared Drive |
Destination Host | text box | Enter the destination host IP address to transfer files from SAP to Incorta. |
Destination Authentication Method | drop down list | Select the authentication method. The options are: Private Key, Password |
Destination Username | text box | Enter the destination username of the Incorta machine. The default is incorta. |
Destination Path | text box | Enter the destination path. The default is /home/incorta/ASYNC_DATAEXTRACTS/SAPS4HANAIDES |
Use Data Agent | toggle | Enable using a data agent to securely ingest data from an external data source that is behind a firewall. For more information, please review Tools → Data Agent and Tools → Data Manager. |
Data Agent | drop down list | Enable Use Data Agent to configure this property. Select from the data agents created in the tenant, if any. |
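For reference, these connection properties correspond closely to the standard SAP JCo destination settings. The following sketch shows one way to express them as a file-based JCo destination; the host, client, and destination name are placeholder values, and the connector manages its destinations internally.

```java
import com.sap.conn.jco.ext.DestinationDataProvider;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

public class DestinationConfigSketch {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        props.setProperty(DestinationDataProvider.JCO_ASHOST, "sap-app-host.example.com"); // SAP Application Server Host
        props.setProperty(DestinationDataProvider.JCO_SYSNR, "00");                        // SAP System Number (00-99)
        props.setProperty(DestinationDataProvider.JCO_CLIENT, "100");                      // SAP Client
        props.setProperty(DestinationDataProvider.JCO_USER, "INC_DE_USER");                // User ID
        props.setProperty(DestinationDataProvider.JCO_PASSWD, "********");                 // Password
        props.setProperty(DestinationDataProvider.JCO_PEAK_LIMIT, "10");                   // Peak Limit
        props.setProperty(DestinationDataProvider.JCO_POOL_CAPACITY, "3");                 // Pool Capacity

        // Persist as a file-based destination named INCORTA_SAP_ERP.
        try (FileOutputStream out = new FileOutputStream("INCORTA_SAP_ERP.jcoDestination")) {
            props.store(out, "Illustrative JCo destination built from the connector properties");
        }
    }
}
```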
Create a schema with the Schema Wizard
Here are the steps to create an SAP ERP schema with the Schema Wizard:
- Sign in to the Incorta Direct Data Platform.
- In the Navigation bar, select Schema.
- In the Action bar, select + New → Schema Wizard.
- In (1) Choose a Source, specify the following:
- For Enter a name, enter the schema name.
- For Select a Datasource, select the SAP ERP external data source.
- Optionally create a description.
- In the Schema Wizard footer, select Next.
- In (2) Manage Tables, in the Data Panel, first select the name of the Data Source, and then check the Select All checkbox.
- In the Schema Wizard footer, select Next.
- In (3) Finalize, in the Schema Wizard footer, select Create Schema.
Create a schema with the Schema Designer
Here are the steps to create an SAP ERP schema using the Schema Designer:
- Sign in to the Incorta Direct Data Platform.
- In the Navigation bar, select Schema.
- In the Action bar, select + New → Create Schema.
- In Name, specify the schema name, and select Save.
- In Start adding tables to your schema, select SAP.
- In the Data Source dialog, specify the SAP ERP table data source properties.
- Select Add.
- In the Table Editor, in the Table Summary section, enter the table name.
- To save your changes, select Done in the Action bar.
SAP ERP table data source properties
For a schema table in Incorta, you can define the following SAP ERP specific data source properties:
Property | Control | Description
---------|---------|-------------
Type | drop down list | Default is SQL Database
Data Source | drop down list | Select the SAP ERP external data source
Schema | drop down list | Select the SAP schema
Table | drop down list | Select the SAP table
Derived Columns | text box | Enter the table derived columns
Selected Columns | text box | Enter the source table columns you would like to include. Use a comma separator between column names with no spaces. For example, MATNR,KOKRS,BELNR. Leave this field empty if you need to include all the source table columns.
Extraction Filter | text box | Enter the filter to apply to the data extracted from the source table. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, COSP~GJAHR = 2021.
Full Load Setup | toggle | Enable this property to set up the full load details
Full Load Derived Columns | text box | Enable Full Load Setup to configure this property. Define constant value columns. This is useful in cases where the number of full load columns is less than the number of incremental load columns.
Dependent Table | toggle | Enable Full Load Setup to configure this property. Enable this property to specify a dependent table.
Dependent Schema | drop down list | Enable Dependent Table to configure this property. Select the dependent schema.
Dependent Table | drop down list | Enable Dependent Table to configure this property. Select the dependent table.
Dependent Join Type | drop down list | Enable Dependent Table to configure this property. Select the join type between the source table and dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Join Condition | text box | Enable Dependent Table to configure this property. Enter the join condition between the source table and dependent table. For example, BKPF~BUKRS = BSEG~BUKRS.
Dependent Columns | text box | Enable Dependent Table to configure this property. Enter the names of the columns that you would like to include from the dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Dependent Table Extraction Filter | text box | Enable Dependent Table to configure this property. Enter the filter to be applied to the data extracted from the dependent table. This filter will only be considered when you join a cluster and/or pool table with a transparent table. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, CDHDR~OBJECTCLAS IN ('MAKT','MAKTC').
Dependent Table 2 | toggle | Enable Dependent Table to configure this property. Enable this property to specify a second dependent table.
Dependent Schema 2 | drop down list | Enable Dependent Table 2 to configure this property. Select the dependent schema for the second dependent table.
Dependent Table 2 | drop down list | Enable Dependent Table 2 to configure this property. Select the second dependent table.
Dependent Table 2 Join Type | drop down list | Enable Dependent Table 2 to configure this property. Select the join type between the source table and the second dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Dependent Table 2 Join Condition | text box | Enable Dependent Table 2 to configure this property. Enter the join condition between the source table and the second dependent table. For example, BKPF~BUKRS = BSEG~BUKRS.
Dependent Table 2 Columns | text box | Enable Dependent Table 2 to configure this property. Enter the names of the columns that you would like to include from the second dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Dependent Table 3 | toggle | Enable Dependent Table 2 to configure this property. Enable this property to specify a third dependent table.
Dependent Schema 3 | drop down list | Enable Dependent Table 3 to configure this property. Select the dependent schema for the third dependent table.
Dependent Table 3 | drop down list | Enable Dependent Table 3 to configure this property. Select the third dependent table.
Dependent Table 3 Join Type | drop down list | Enable Dependent Table 3 to configure this property. Select the join type between the source table and the third dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Dependent Table 3 Join Condition | text box | Enable Dependent Table 3 to configure this property. Enter the join condition between the source table and the third dependent table. For example, BKPF~BUKRS = BSEG~BUKRS.
Dependent Table 3 Columns | text box | Enable Dependent Table 3 to configure this property. Enter the names of the columns that you would like to include from the third dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Additional Full Load Extraction Filter | text box | Enter an additional filter to apply to the data extracted from the source table during a full load. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, COSP~WRTTP IN ('E','V').
Full load Fetch Method | drop down list | Select the backend database fetch method to use when executing a full load query against the SAP Function Module. The options are OPEN_SQL (default), OPEN_CURSOR, Chunking by date or timestamp columns, and Chunking by string or number column. For the chunking options, an example chunking filter is COSP~GJAHR = 2021 AND (COSP~PERBL > @@GV_PERIOD OR COSP~BUDGET_PD > @@GV_PERIOD).
Incremental Load | toggle | Enable the incremental load configuration for the schema table
Incremental Load Type | drop down list | Enable Incremental Load to configure this property. Select the incremental load type. The options are Incremental Load by date or timestamp columns, Incremental Load by Static Filter, and Chunking by string or number columns.
Incremental Dependent Table Join Type | drop down list | Enable Incremental Dependent Table to configure this property. Select the join type between the source table and the incremental dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Incremental Dependent Table Join Condition | text box | Enable Incremental Dependent Table to configure this property. Enter the join condition between the source table and the incremental dependent table. For example, BKPF~BUKRS = BSEG~BUKRS.
Incremental Dependent Columns | text box | Enable Incremental Dependent Table to configure this property. Enter the names of the columns that you would like to include from the incremental dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any incremental dependent columns.
Incremental Dependent Table Extraction Filter | text box | Enable Incremental Dependent Table to configure this property. Enter the filter to be applied to the data extracted from the incremental dependent table. This filter will only be considered when you join a cluster and/or pool table with a transparent table. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, CDHDR~OBJECTCLAS IN ('MAKT','MAKTC').
Incremental Dependent Table 2 | toggle | Enable Incremental Dependent Table to configure this property. Enable this property to specify a second incremental dependent table.
Incremental Dependent Schema 2 | drop down list | Enable Incremental Dependent Table 2 to configure this property. Select the dependent schema for the second incremental dependent table.
Incremental Dependent Table 2 | drop down list | Enable Incremental Dependent Table 2 to configure this property. Select the second incremental dependent table.
Incremental Dependent Table 2 Join Type | drop down list | Enable Incremental Dependent Table 2 to configure this property. Select the join type between the source table and the second incremental dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Incremental Dependent Table 2 Join Condition | text box | Enable Incremental Dependent Table 2 to configure this property. Enter the join condition between the source table and the second incremental dependent table. For example, BKPF~BUKRS = BSEG~BUKRS.
Incremental Dependent Table 2 Columns | text box | Enable Incremental Dependent Table 2 to configure this property. Enter the names of the columns that you would like to include from the second incremental dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Incremental Dependent Table 3 | toggle | Enable Incremental Dependent Table 2 to configure this property. Enable this property to specify a third incremental dependent table.
Incremental Dependent Schema 3 | drop down list | Enable Incremental Dependent Table 3 to configure this property. Select the dependent schema for the third incremental dependent table.
Incremental Dependent Table 3 | drop down list | Enable Incremental Dependent Table 3 to configure this property. Select the third incremental dependent table.
Incremental Dependent Table 3 Join Type | drop down list | Enable Incremental Dependent Table 3 to configure this property. Select the join type between the source table and the third incremental dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Incremental Dependent Table 3 Join Condition | text box | Enable Incremental Dependent Table 3 to configure this property. Enter the join condition between the source table and the third incremental dependent table. For example, BKPF~BUKRS = BSEG~BUKRS.
Incremental Dependent Table 3 Columns | text box | Enable Incremental Dependent Table 3 to configure this property. Enter the names of the columns that you would like to include from the third incremental dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Additional Incremental Load Extraction Filter | text box | Enter an additional filter to apply to the data extracted from the source table during an incremental load. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, COSP~WRTTP IN ('E','V').
Incremental load Fetch Method | drop down list | Select the backend database fetch method to use when executing an incremental load query against the SAP Function Module. The options are OPEN_SQL (default), OPEN_CURSOR, Incremental Load by date or timestamp columns, Incremental Load by Static Filter, Chunking by string or number columns, Chunking by Year and Period columns, Last Successful Extract Time (Incremental Load by date or timestamp column only), Get Chunking Values Filter (Chunking by string or number columns only), and Maximum Value of a Column.
View the schema diagram with the Schema Diagram Viewer
Here are the steps to view the schema diagram using the Schema Diagram Viewer:
- Sign in to the Incorta Direct Data Platform.
- In the Navigation bar, select Schema.
- In the list of schemas, select the SAP ERP schema.
- In the Schema Designer, in the Action bar, select Diagram.
Load the schema
Here are the steps to perform a Full Load of the SAP ERP schema using the Schema Designer:
- Sign in to the Incorta Direct Data Platform.
- In the Navigation bar, select Schema.
- In the list of schemas, select the SAP ERP schema.
- In the Schema Designer, in the Action bar, select Load → Load Now → Full.
- To review the load status, in Last Load Status, select the date.
Explore the schema
With the full load of the SAP ERP schema complete, you can use the Analyzer to explore the schema, create your first insight, and save the insight to a new dashboard.
To open the Analyzer from the schema, follow these steps:
- In the Navigation bar, select Schema.
- In the Schema Manager, in the List view, select the SAP ERP schema.
- In the Schema Designer, in the Action bar, select Explore Data.