
About SAP ERP

SAP ERP is an enterprise resource planning system built for businesses of all sizes, with industry-specific modules from manufacturing to retail. The software supports and integrates almost every functional area of a business, from procurement of goods and services, sales and distribution, finance, accounting, human resources, manufacturing, and production planning to logistics and warehouse management.

SAP ERP Connector Updates

This section covers the updates introduced in newer versions of the SAP ERP connector available on the Incorta connectors marketplace.

To get the newest version of the connector, update it from the marketplace.

Version | Updates
2.0.0.5 | Ability to skip RFC data while exporting and importing a tenant to maintain their uniqueness per connection.

About the SAP ERP Connector

The SAP ERP Connector loads data into Incorta from SAP ERP tables, views, and Advanced Business Application Programming (ABAP) Core Data Services (CDS) views.

The connector is built using the SAP Java Connector (JCo) library and uses a Remote Function Call (RFC)-enabled Function Module program to perform the following activities (a minimal JCo sketch follows the list):

  • Receive Requests
  • Generate Schema Definition
  • Extract Data from Tables
  • Log Extraction and Processing Status
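
For orientation, the following is a minimal JCo 3 sketch of the kind of RFC round trip the connector performs. The destination name INCORTA_SAP and its properties file are illustrative placeholders, not the connector's actual configuration:

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoFunction;

public class RfcRoundTripSketch {
    public static void main(String[] args) throws Exception {
        // JCo resolves "INCORTA_SAP" (a placeholder name) from a file named
        // INCORTA_SAP.jcoDestinationProperties in the working directory.
        JCoDestination destination = JCoDestinationManager.getDestination("INCORTA_SAP");
        destination.ping(); // verify the RFC connection

        // Look up the RFC-enabled function module in the remote repository.
        JCoFunction function =
                destination.getRepository().getFunction("ZINC_SAP_ERP_DATA_EXTRACTOR");
        if (function == null) {
            throw new IllegalStateException("Function module is not installed in this system");
        }
    }
}
```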
Important

Starting with the 2024.1.3 On-Premises release, the SAP ERP Connector no longer bundles the JCo 3 libraries. For more information, refer to the License Terms for SAP Connectors. Accordingly, you must manually download the required libraries from the SAP Java Connector site and add them to this directory: <INCORTA_HOME>/IncortaNode/extensions/connectors/shared-libs/sapjco3.

The SAP ERP connector uses the configuration table ZINC_CONFIG_PROP to manage table access control. Table access control can be enforced either through the User Profile Authorization Groups method (ZINC_AUTH_GROUP) or through a list of table names maintained in a custom Z table method (ZINC_TAB_ACCESS).

SAP ERP Connector Compatibility

The SAP ERP connector has been developed and tested for the following SAP ERP Systems:

  • SAP ECC 6.0 EHP 7 - 700, 702, 731, 740, 752
  • SAP S/4 HANA - 1503, 1511, 1605, 1610, 1709, 1809, 1909

The SAP ERP connector supports the following Incorta specific functionality:

Feature | Supported
Chunking |
Data Agent |
Encryption at Ingest |
Incremental Load |
Multi-Source |
OAuth |
Performance Optimized |
Remote |
Single-Source |
Spark Extraction |
Webhook Callbacks |
Important

If you are running the data agent on a Windows machine, you must install the Visual C++ Redistributable Packages for Visual Studio 2013.

Deployment of the SAP ERP Connector in SAP

This section provides step-by-step information for importing binary transports containing Incorta's development objects into the SAP instance. It is intended for SAP BASIS administrators or Incorta schema administrators with the proper SAP transaction and SAP transport management access rights.

Installation User Profile

The following authorization objects are required for the SAP User in order to complete the import steps:

SAP Transaction Code | SAP Transaction Description | Authorization Object
CG3Z | File Upload | S_DATASET
STMS | Transport Management System | S_TRANSPRT, S_CTS_ADMI
SE10 | Transport Organizer | S_TRANSPRT, S_DATASET
STMS_IMPORT | Transport Request Import | S_TRANSPRT, S_CTS_ADMI
AL11 | Display SAP Directories | S_ADMI_FCD
FILE | Cross-Client File Names/Paths |
SE16N | General Table Display |
SM30 | Maintain Table View |
SE11 | ABAP Dictionary Maintenance | S_DEVELOP
SU01 | User Maintenance | S_USER_GRP
SU53 | Check User Authorization |
Additional | User Interface | S_GUI
SM69 | Operating System Commands |

Extract User Profile

The extract user, INC_DE_USER, will be created as a Communications Data user type. The following authorization objects are required for the SAP user so that Incorta can perform the data extraction:

Authorization Object | Authorization Field | Authorization Value
S_RFC | RFC_TYPE | FUGR, FUNC
S_RFC | RFC_NAME | RFCPING, SYST, ZINC_INTF_REL, ZINC_SAP_ERP_DATA_EXTRACTOR, RFC1, RFC_GET_FUNCTION_INTERFACE, DDIF_FIELDINF_GET, SDIFRUNTIME, RFC_METADATA_GET
S_RFC | ACTVT | 16 - Execute
S_TCODE | TCD | SE16
S_BTCH_ADM | BTCADMIN | A - Optional Check for Class A Jobs; B - Optional Check for Class B Jobs; C - Copy Another User's Jobs; D - Authorization to Display Jobs from Other Clients; N - No Administrator Authorization; P - Authorization to Create Periodic Jobs; Y - Background Administrator Authorization
S_BTCH_JOB | JOBACTION | RELE
S_BTCH_JOB | JOBGROUP | *
S_TABU_DIS | DICBERCLS | <Table_Authorization_Group>
S_TABU_DIS | ACTVT | 03 - Display
S_TABU_NAM | ACTVT | 03 - Display
S_TABU_NAM | TABLE | <Table_Name>
Note
  • <Table_Name> represents a complete list of table names. Wildcard characters are not supported.
  • <Table_Authorization_Group> represents a complete list of authorization group names. Wildcard characters are not supported.

SAP Function Module Installation Steps

The steps to install the SAP function module are as follows, with details for each step in the sections below:

Set up the SAP Function Module Transport Request

Download the Installation Files

Contact Incorta Support to obtain the SAP ERP Connector binary transport ZIP file. The ZIP file will contain two files:

  • CO file: The filename is in the format KXXXXXX.YYY
  • DATA file: The filename is in the format RXXXXXX.YYY
Note

XXXXXX is a six-digit number and YYY is the source system ID.

Upload the Transport Files

Upload the CO and DATA transport files to the target application server.

Upload the CO File

Execute the SAP Transaction Code CG3Z in the SAP GUI to upload the CO file to the default CO transport request directory: /usr/sap/trans/cofiles

The Upload File: Parameters dialog requires the following information:

  • Source file on front end (e.g. /Incorta/Release1/Transport_Requests/K900012.ERP)
  • Target file on the application server (e.g. /usr/sap/trans/cofiles/K900012.ERP)

Upload the DATA File

Execute the SAP Transaction Code CG3Z in the SAP GUI to upload the DATA file to the default DATA transport request directory: /usr/sap/trans/data

The Upload File: Parameters dialog requires the following information:

  • Source file on front end (e.g. /Incorta/Release1/Transport_Requests/R900012.ERP)
  • Target file on the application server (e.g. /usr/sap/trans/data/R900012.ERP)

Import the Transport Request

Import the transport request with the following steps:

  • Execute the SAP Transaction Code STMS_IMPORT in the SAP GUI to import the transport request for the uploaded files.
  • Select the Extras > Other Requests > Add menu option to add a new transport request to the import queue.
  • In the Add Transport Request to Import Queue dialog:
    • Enter the transport request (e.g. ERPK900012) or select the lookup button to view the available transport request codes.
    • Select the green checkbox.
  • In the Add Transport Request confirmation dialog, select Yes.
  • Once the transport request is added, it is shown in the Import Queue. To import the transport request, select the transport request in the queue, and select Import Request (truck icon).
  • In the Import Transport Request dialog:
    • Enter the Target Client (e.g. 100).
    • Select Options, and select the checkbox for each Import Option.
    • Select the green checkbox.
  • In the Start Import dialog, select Yes.
  • Once the transport is imported, the Import Queue Short Text for the request will be populated (e.g. Incorta SAP ERP Extraction connector RFC - RELEASE 1). Select the request to view the Import Log File. All objects should have been created with a valid return code.

Configure the SAP Function Module

Connection Properties

Connection properties define the default settings or appropriate values for the Function Module(s). They are stored as key-value pairs, per Function Module, in the ZINC_CONFIG_PROP table, which has three columns: Function Module, Property Name, and Property Value. Execute the SAP Transaction Code SM30 to view or change the connection properties.

Note

The connection properties and the values are development objects. These should be changed only if needed during development or initial integration.

The following are the ZINC_CONFIG_PROP connection properties and the respective default values:

Property Name | Property Description | Property Value
ZINC_CP_ASYNC_FILE_CLEAN_UP_METHOD | Enter the file cleanup method. Possible values are IMMEDIATE (remove the data file, schema definition file, and control file, if any, immediately) and SCHEDULE (clean up the files manually during testing). | IMMEDIATE
ZINC_CP_ASYNC_FILE_TRAN_METHOD | Enter the method used when extracting the data. The default is ASYNC_FILE_UPLOAD_TO_INCORTA_SERVER_SCP. | ASYNC_FILE_UPLOAD_TO_INCORTA_SERVER_SCP
ZINC_CP_AUTO_CLEANUP_FLAG | Enable or disable the automatic file cleanup process. Possible values are Y and N. | Y
ZINC_CP_BUILD | Enter the date of the last build. | 20201020
ZINC_CP_CHARACTER_ENCODING | Enter the character encoding format used for data extraction. The default is utf-16le. | utf-16le
ZINC_CP_CLEANUP_WINDOW_HOURS | Enter how often the cleanup program should run. The default is 24. | 24
ZINC_CP_COMPRESSION_ENABLED | Enable or disable compression. The default is N. | Y
ZINC_CP_COMPRESSION_METHOD | Enter the compression method to be used for extracted data. The default is GZIP. | GZIP
ZINC_CP_COMPRESS_ENCODING | Enter the character encoding format used for data compression. The default is iso-8859-1. | iso-8859-1
ZINC_CP_DATE_FORMAT | Enter how the date should be represented in the extraction. The default is YYYYMMDD. | YYYYMMDD
ZINC_CP_DATE_FORMAT_BW | Enter how the date should be represented in the Business Warehouse (BW) extraction. The default is YYYY.MM.DD. | YYYY.MM.DD
ZINC_CP_DELETE_LOG | Enable or disable the automatic deletion process for log entries. Possible values are Y and N. | Y
ZINC_CP_DELIMITER_CHARACTERS | Enter the character or characters used for delimiting columns. The default is the comma (,). | ,
ZINC_CP_DEL_LOG_PERIOD | Enter the number of days to keep log entries in the log tables for debugging and/or test purposes. The default is 90. | 90
ZINC_CP_ENABLE_PARALLEL_PROCESSING* | Enable or disable parallel processing. The default is Y. | N
ZINC_CP_ENABLE_TIMESTAMP_CHUNKING | Enable or disable chunking using the timestamp field. The default is Y. | Y
ZINC_CP_ENCLOSING_CHARACTERS | Enter the character or characters used for enclosing a column value when the delimiter is found in the value. The default is ". | "
ZINC_CP_ENCRYPTION_ENABLED | Enable or disable encryption. |
ZINC_CP_ENCRYPTION_METHOD | Enter the encryption method. |
ZINC_CP_ESCAPE_CHARACTERS | Enter the character or characters used to escape delimiter occurrences in a column value. The default is ". | "
ZINC_CP_EXTRACTION_METHOD | Enter how the data extraction and transfer will be done. The default is SYNCHRONOUS. | SYNCHRONOUS
ZINC_CP_EXTRACTION_SELECTION | Enter what types of extractions are allowed in the connector. The default is TABLE. | TABLE
ZINC_CP_FILE_LAST_CLEANUP_TIMESTAMP | Stores the timestamp of the last cleanup of the mirror files. The function module updates this timestamp when the file cleanup program completes successfully. | 20200406161645.3327320
ZINC_CP_FILE_LOCATION | Enter where the extraction file will be stored, as either a logical file path name or an actual physical directory path. The default is the logical file path name created in the FILE t-code, ZINC_INCORTA. | ZINC_INCORTA
ZINC_JOB_SERVER_GRP | Enter the background job server group allocation. The default is blank. Note: a job server group is needed only when there are multiple application server instances; on a single application server, this setting is not needed. | ZINC_JOBSG
ZINC_CP_LOG_SELECTED_COLUMNS | | N
ZINC_CP_MIRROR_FILE_ENABLED | Enable file generation for the data extraction. Possible values are N and Y. The default is N. | N
ZINC_CP_MIRROR_FILE_STORE_HOURS | Enter how many hours the file can be stored on the application server. The default is 8. | 24
ZINC_CP_PARALLELPROCESSING_SERVERGROUP* | Enter the server group for parallel processes. | 390 (default parallel server group)
ZINC_CP_RELEASE | Enter the current version of the connector. | 4.4
ZINC_CP_REQUEST_TYPE | Enter what types of requests are allowed in the connector. Possible values are GET_DATA, GET_SCHEMA, GET_SCHEMA_LIST, GET_TABLE_LIST, GET_COLUMN_LIST, SUBMIT_REQUEST, EXECUTE_REQUEST, SUBMIT_REQUEST_ORIG, and EXECUTE_REQUEST_ORIG. The default is GET_DATA. | GET_DATA
ZINC_CP_ROW_DELIMITER_TYPECODE | Enter the character or characters used for delimiting rows. The default is CR_LF. | CR_LF
ZINC_CP_SCHEMA_DISCOVERY_METHOD | Enter what type of table access restriction is allowed for the connection. The supported methods are User Profile Authorization Groups (ZINC_AUTH_GROUP) and Custom Z Table List (ZINC_TAB_ACCESS). | ZINC_AUTH_GROUP, ZINC_TAB_ACCESS
ZINC_CP_SCHEMA_TABLE_UPDATES | Update the schema definition log tables with the latest table definitions and when they are executed. The default is Y. | Y
ZINC_CP_SELECT_PACKAGE_SIZE | Enter the package size used when extracting the data. The default is 15000. | 50000
ZINC_CP_STXL_LAST_LOAD_DATE | | 00000000
ZINC_CP_TIME_FORMAT | Enter how the time should be represented in the extraction. The default is HHMMSS. | HHMMSS
ZINC_CP_TRANSPORT_ENTRIES | Enter the request number. | S4HK900246
ZINC_CP_UNCOMPRESSED_DATA_PACKET_SIZE_MB | Enter the package size in MB used when extracting data in uncompressed mode. The default is 100. | 100

* These properties are available only for SAP S/4HANA; the rest are available for all SAP versions.
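
As an illustration of how the delimiter, enclosing, and escape defaults interact, here is a small sketch (not the connector's internal code) that encodes a column value containing the delimiter:

```java
public class DelimiterEscapeSketch {
    // Illustrative only: shows how the ZINC_CP_DELIMITER_CHARACTERS (","),
    // ZINC_CP_ENCLOSING_CHARACTERS (") and ZINC_CP_ESCAPE_CHARACTERS (")
    // defaults combine when a column value contains the delimiter.
    static String encodeColumn(String value, char delimiter, char enclosing, char escape) {
        if (value.indexOf(delimiter) < 0 && value.indexOf(enclosing) < 0) {
            return value; // nothing to protect
        }
        StringBuilder sb = new StringBuilder().append(enclosing);
        for (char c : value.toCharArray()) {
            if (c == enclosing) sb.append(escape); // escape embedded enclosing characters
            sb.append(c);
        }
        return sb.append(enclosing).toString();
    }

    public static void main(String[] args) {
        // Prints: "Pumps, industrial"
        System.out.println(encodeColumn("Pumps, industrial", ',', '"', '"'));
    }
}
```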

Update the Schema Discovery Method

The following steps can be used to update the Schema Discovery Method:

  • Execute the SAP Transaction Code SM30 in the SAP GUI.
  • Enter the table name ZINC_CONFIG_PROP.
  • Select Display.
  • Select Edit (pen and glasses icon) to edit the table contents.
  • Update the value of the ZINC_CP_SCHEMA_DISCOVERY_METHOD property.
  • Select Save (floppy disk icon).

Set up and Configure the SAP Data Extraction User

Create the Data Extraction User

Execute SAP Transaction Code SU01 to create the INC_DE_USER user, which is used to access the SAP system for the data extraction.

  • In the User Maintenance: Initial Screen dialog, enter the user ID, INC_DE_USER, in User.

  • Select create (paper icon).

  • Enter the following Person properties for the Incorta Data Extract User:

    • Last name: INCORTA
    • First name: DATA EXTRACT USER
  • Select Logon Data and select a User Type of Communications Data.

    Note

    If the same user will be used to perform table maintenance or log into the SAP GUI, then select a User Type of Dialog.

  • Select the Defaults tab to change the defaults as necessary.

  • Select save (floppy disk icon).

Create the User Role and Assignment

Configure the authorization objects, fields, and values listed in the Extract User Profile section above for the INC_DE_USER user.

Verify the SAP Function Module Setup

To verify the SAP Function Module, perform the following steps:

  • Execute the SAP Transaction Code SE37.
  • In the Function Builder: Initial Screen dialog, enter the name of the main function module, ZINC_SAP_ERP_DATA_EXTRACTOR.
  • Select the Test/Execute icon.

Get Schema Request

  • In the Test Function Module: Initial Screen dialog, enter the following import parameter:
    • IN_REQUEST_TYPE: GET_SCHEMA_LIST
  • Select the Execute icon.
  • The output for GET_SCHEMA_LIST is captured in the OUT_SCHEMA_DEFINITION table. Select the OUT_SCHEMA_DEFINITION entries to see the contents of the table.
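
The same test can be scripted through JCo. The following is a hedged sketch that reuses the placeholder INCORTA_SAP destination from the earlier example; the parameter and table names come from the steps above:

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoRecordMetaData;
import com.sap.conn.jco.JCoTable;

public class GetSchemaListSketch {
    public static void main(String[] args) throws Exception {
        JCoDestination destination = JCoDestinationManager.getDestination("INCORTA_SAP");
        JCoFunction fm = destination.getRepository().getFunction("ZINC_SAP_ERP_DATA_EXTRACTOR");

        // Same import parameter as in the SE37 test above.
        fm.getImportParameterList().setValue("IN_REQUEST_TYPE", "GET_SCHEMA_LIST");
        fm.execute(destination);

        // The schema list comes back in the OUT_SCHEMA_DEFINITION table parameter.
        JCoTable schemaList = fm.getTableParameterList().getTable("OUT_SCHEMA_DEFINITION");
        JCoRecordMetaData meta = schemaList.getRecordMetaData();
        for (int row = 0; row < schemaList.getNumRows(); row++) {
            schemaList.setRow(row);
            StringBuilder line = new StringBuilder();
            for (int f = 0; f < meta.getFieldCount(); f++) {
                line.append(meta.getName(f)).append('=')
                    .append(schemaList.getString(f)).append("  ");
            }
            System.out.println(line.toString().trim());
        }
    }
}
```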

Get Data – Master or Lookup Table

  • In the Test Function Module: Initial Screen dialog, enter the following import parameters:
    • IN_REQUEST_TYPE: GET_DATA
    • IN_EXTRACTION_SELECTION: TABLE
    • IN_SELECTED_TABLE_NAME: T024
  • Select the Execute icon.
  • The output for GET_DATA is captured in the output table specified in the OUT_DYNAMIC_TABLE_SELECTED output parameter. For the selected table, T024, the output table is OUT_DYNAMIC_TABLE_WIDTH_1024.
  • Select the output table entries to see the contents of the extracted data.

Get Data – Transaction Table Stand Alone

  • In the Test Function Module: Initial Screen dialog, enter the following import parameters:
    • IN_REQUEST_TYPE: GET_DATA
    • IN_EXTRACTION_SELECTION: TABLE
    • IN_SELECTED_TABLE_NAME: BKPF
    • IN_SELECTED_TAB_FULLINC_FILTER: BKPF~CPUDT >= '20150101' AND BKPF~CPUDT <= '20150131'
  • Select the Execute icon.
  • The output for GET_DATA is captured in the output table specified in the OUT_DYNAMIC_TABLE_SELECTED output parameter. For the selected table, BKPF, the output table is OUT_DYNAMIC_TABLE_WIDTH_2048.
  • Select the output table entries to see the contents of the extracted data.

Get Data – Transaction Table with Parent Table

  • In the Test Function Module: Initial Screen dialog, enter the following import parameters:
    • IN_REQUEST_TYPE: GET_DATA
    • IN_EXTRACTION_SELECTION: TABLE
    • IN_SELECTED_TABLE_NAME: BSEG
    • IN_PARENT_TABLE_NAME: BKPF
    • IN_PARENT_TABLE_JOIN_CONDT: BKPF~BUKRS = BSEG~BUKRS AND BKPF~BELNR = BSEG~BELNR AND BKPF~GJAHR = BSEG~GJAHR
    • IN_SELECTED_TAB_FULLINC_FILTER: BKPF~CPUDT >= '20150101' AND BKPF~CPUDT <= '20150131'
  • Select the Execute icon.
  • The output for GET_DATA is captured in the output table specified in the OUT_DYNAMIC_TABLE_SELECTED output parameter. For the selected table BSEG, the output table is OUT_DYNAMIC_TABLE_WIDTH_5000.
  • Select the output table entries to see the contents of the extracted data.
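
The three GET_DATA tests above can likewise be driven through JCo. The following sketch reproduces the parent-table case; the INCORTA_SAP destination name is a placeholder, while the parameter names and test values are taken from the steps above:

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoParameterList;
import com.sap.conn.jco.JCoTable;

public class GetDataSketch {
    public static void main(String[] args) throws Exception {
        JCoDestination destination = JCoDestinationManager.getDestination("INCORTA_SAP");
        JCoFunction fm = destination.getRepository().getFunction("ZINC_SAP_ERP_DATA_EXTRACTOR");

        // Import parameters mirror the SE37 test values above.
        JCoParameterList in = fm.getImportParameterList();
        in.setValue("IN_REQUEST_TYPE", "GET_DATA");
        in.setValue("IN_EXTRACTION_SELECTION", "TABLE");
        in.setValue("IN_SELECTED_TABLE_NAME", "BSEG");
        in.setValue("IN_PARENT_TABLE_NAME", "BKPF");
        in.setValue("IN_PARENT_TABLE_JOIN_CONDT",
                "BKPF~BUKRS = BSEG~BUKRS AND BKPF~BELNR = BSEG~BELNR AND BKPF~GJAHR = BSEG~GJAHR");
        in.setValue("IN_SELECTED_TAB_FULLINC_FILTER",
                "BKPF~CPUDT >= '20150101' AND BKPF~CPUDT <= '20150131'");
        fm.execute(destination);

        // OUT_DYNAMIC_TABLE_SELECTED names the dynamic output table that holds
        // the rows (OUT_DYNAMIC_TABLE_WIDTH_5000 for BSEG in the steps above).
        String outName = fm.getExportParameterList().getString("OUT_DYNAMIC_TABLE_SELECTED");
        JCoTable rows = fm.getTableParameterList().getTable(outName);
        System.out.println(outName + " returned " + rows.getNumRows() + " rows");
    }
}
```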

Additional configurations to support Callback

Update the role of the existing extraction user

Grant access to the additional function module by adding the following value to the S_RFC authorization object:

Authorization Object | Authorization Field | Authorization Value
S_RFC | RFC_NAME | RFC_METADATA_GET

Note: Make sure to keep the already existing values.

Create RFC Destination and Whitelist Incorta

To configure the callback extraction mode, create an RFC destination in your SAP system.

Important

The RFC destination must be unique per Incorta data source when connecting to the same SAP source. For example, if you need to connect both Incorta development and production environments to the same SAP data source, you must create two destinations to the SAP environment, one for each.

Create a new destination:

  • Select transaction code SM59.
  • Select the create icon.
  • In the create window, enter the following:
    • RFC destination name
    • Description
  • In the Technical Settings tab, enter the following:
    • Activation type: “Registered Server Program”
    • Program ID for the registered server program
    • Gateway host - the application server name
    • Gateway service
  • In the Unicode tab, set the communication type to Unicode.
  • Select Save.
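
On the JCo side, a destination with the Registered Server Program activation type corresponds to a JCo server that registers the Program ID at the SAP gateway. A minimal sketch follows, assuming a server configuration named INCORTA_CALLBACK; the file and property values shown in the comments are illustrative, not the connector's internal setup:

```java
import com.sap.conn.jco.server.JCoServer;
import com.sap.conn.jco.server.JCoServerFactory;

public class CallbackServerSketch {
    public static void main(String[] args) throws Exception {
        // JCo resolves "INCORTA_CALLBACK" from INCORTA_CALLBACK.jcoServerProperties, e.g.:
        //   jco.server.gwhost = <gateway host entered in the SM59 destination>
        //   jco.server.gwserv = <gateway service, e.g. sapgw00>
        //   jco.server.progid = <Program ID entered in the SM59 destination>
        //   jco.server.repository_destination = <a client destination for metadata lookups>
        JCoServer server = JCoServerFactory.getServer("INCORTA_CALLBACK");
        server.start(); // registers the Program ID at the SAP gateway
    }
}
```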

Create a background server group

If you have multiple application server instances, target only one specific instance for Incorta to use for running the background jobs.

Create a background job server group as follows in transaction code SM61:

  • Select “Job Server Groups”.
  • Select the create Group button.
  • Enter the group name and select Continue.

Add the application server instance to the job server group:

  • Select the job server group you created.
  • Select Add Assignment.
  • Select the application server name from the list of instances available.
  • Select Continue.

Configure the job server group created in the Incorta Configuration table:

  • Select the transaction code SM30.
  • In the table name, enter ZINC_CONFIG_PROP, and then select Maintain.
  • Find the property ZINC_CP_JOB_SERVER_GRP and assign the job server group name you created.
  • Select Save.

Validate profile parameters

For the RFC callback connection to function properly, ensure that the following profile parameters are set in transaction code RZ11; the value of each parameter should be 1 or above.

  • gw/acl_mode
  • rfc/callback_security_method
  • gw/reg_no_conn_info

Whitelist RFC callback

You must whitelist the Incorta IP address on your SAP system so that SAP can call the Incorta server and send the data back.

Note

If you are using a data agent, whitelist the data agent IP address instead of the Incorta server IP.

You must maintain this information in the Security and Registration info files. To verify:

  • Go to transaction code RZ11.
  • To view the Registration info file, enter profile name as gw/reg_info, then select Display.
  • To view the Security info file, enter profile name as gw/sec_info, then select Display.

If the SAP Kernel version is 753 or above, you can do the whitelisting from the SAP Server itself.

Security info file

  • Navigate to the transaction code SMGW.
  • In the menu bar, select Goto > Expert Functions > External Security > Maintain ACL Files.
  • In the Secinfo File tab, select Create > Standard.
  • Enter the following details:
    • P/D - P to provide permission
    • TP - the JCO Program name entered in the RFC destination
    • User - the Incorta data extraction username
    • Host – Incorta Analytics hostname or IP address
    • User-Host - Enter * to provide full access
  • Select done, then Save.

Registration info file

  • In the Reginfo File tab, select Create > Standard.
  • Enter the following details:
    • P/D - P to provide permission
    • TP - the JCO Program name entered in the RFC destination
    • Host – Incorta Analytics hostname or IP address
    • Access - *
    • Cancel - *
  • Select done, then Save.
Note

Re-read the files by selecting Goto > Expert Functions > External Security > Re-Read NI ACL. Repeat for Re-Read NI ACL Globally.

Steps to connect SAP ERP and Incorta

To connect SAP ERP and Incorta, here are the high-level steps, tools, and procedures:

Create an external data source

Here are the steps to create an external data source with the SAP ERP connector:

  • Sign in to the Incorta Direct Data Platform.
  • In the Navigation bar, select Data.
  • In the Action bar, select + New → Add Data Source.
  • In the Choose a Data Source dialog, in Other, select SAP ERP.
  • In the New Data Source dialog, specify the applicable connector properties.
  • To test, select Test Connection.
  • Select Ok to save your changes.

SAP ERP connector properties

Here are the properties for the SAP ERP connector:

Property | Control | Description
Data Source Name | text box | Enter the name of the data source
SAP Application Server Host | text box | Enter the SAP application server host
SAP System Number | text box | Enter the SAP system number. The value is between 00 and 99.
SAP Client | text box | Enter the SAP client number. This number is used to perform the RFC logon to SAP, binding the agent process with the account created during the transport import.
User ID | text box | Enter the SAP user ID
Password | text box | Enter the SAP password
BAPI Name | text box | Optionally enter the Business Application Programming Interface (BAPI) name. The default is ZINC_SAP_ERP_DATA_EXTRACTOR.
Peak Limit | text box | Optionally enter the maximum number of active connections that can be created for a destination simultaneously. The default is 10.
Pool Capacity | text box | Optionally enter the maximum number of idle connections kept open by the destination. The default is 3.
Metadata Cache Time in Minutes | text box | Optionally enter the metadata cache time in minutes. The default is 60.
Extraction Method | drop down list | Select the extraction method. The options are Asynchronous, Synchronous, and Callback. The default is Asynchronous. Refer to the properties of each mode below.
Datasource Timezone | drop down list | Select the timezone of your data source. If the needed timezone does not exist, select Other.
Other | text box | Available only when you select Other in Datasource Timezone. Enter the abbreviation of your data source timezone.
Use Data Agent | toggle | Disabled by default. Enable it if you are using a data agent.
Data Agent | drop down list | Available when you enable the Use Data Agent option. Select the data agent you need.
Enable Encryption | toggle | Enable this property for data encryption.
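
To relate these fields to the underlying SAP connectivity, the following illustrative sketch writes a standard JCo destination file using the same values the connector asks for; the host, client, and file name are placeholders, and the connector manages its own JCo configuration internally:

```java
import java.io.FileOutputStream;
import java.util.Properties;

public class DestinationFileSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("jco.client.ashost", "sap-app-01.example.com"); // SAP Application Server Host (placeholder)
        props.setProperty("jco.client.sysnr", "00");                     // SAP System Number (00-99)
        props.setProperty("jco.client.client", "100");                   // SAP Client (placeholder)
        props.setProperty("jco.client.user", "INC_DE_USER");             // User ID
        props.setProperty("jco.client.passwd", "********");              // Password (do not hard-code in practice)

        // JCoDestinationManager.getDestination("INCORTA_SAP") would pick this file up.
        try (FileOutputStream out = new FileOutputStream("INCORTA_SAP.jcoDestinationProperties")) {
            props.store(out, "Illustrative JCo destination for the SAP ERP connector");
        }
    }
}
```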
Important: Data Agent

A data agent is a service that runs on a remote host. It is also a data agent object in the Data Manager for a given tenant. An authentication file shared between the data agent object and the data agent service enables an authorized connection without using a VPN or SSH tunnel. With a data agent, you can securely extract data from one or more databases behind a firewall to an Incorta cluster. Your Incorta cluster can reside on-premises or in the cloud. A CMC Administrator must enable and configure an Incorta cluster to support the use of Data Agents. Only a Tenant Administrator (Super User) or a user that belongs to a group with the SuperRole role for a given tenant can create a data agent that connects to a data agent service. To learn more, see Concepts → Data Agent and Tools → Data Agent.

Asynchronous properties

Property | Control | Description
Process Data At Incorta | toggle | Enable this property to process data at Incorta
Keep File For Debugging | toggle | Enable this property to save the file for debugging instead of deleting it when the extraction is finished
Asynchronous File Transfer Method | drop down list | Select the upload method. The options are Upload to Incorta Server using SCP, Upload to Incorta Server using WinSCP, Upload to Incorta Shared Drive, and Upload to Common Shared Drive.
Destination Host | text box | Enter the destination host IP address to transfer files from SAP to Incorta.
Destination Authentication Method | drop down list | Select the authentication method. The options are Private Key and Password.
Destination Username | text box | Enter the destination username of the Incorta machine. The default is incorta.
Destination Path | text box | Enter the destination path. The default is /home/incorta/ASYNC_DATAEXTRACTS/SAPS4HANAIDES

Synchronous properties

Property | Control | Description
Save File For Debugging | toggle | Enable this property to create and save a debugging file. Note: To retrieve this file, contact Incorta Support.

Callback properties

Property | Control | Description
Gateway Service | text box | Enter the port number for the SAP gateway service you want to connect to. The default is 3300.
Server RFC Destination | text box | Enter the RFC destination name you configured in SAP. The default is ZINC_FCO_SERVER. It must be unique per connection.
Server RFC Program ID | text box | Enter the registered server program ID you configured in SAP. The default is INCORTA_JCO_SERVER. It must be unique per connection.
Server Worker Threads | text box | Enter the number of listening threads in the callback server. The default is 10.
Max Parallel Callbacks | text box | Enter the maximum number of background jobs that can be run by the data source. The default is 30.
Max Callback Timeout | text box | Enter the maximum waiting time in seconds before considering the extraction unsuccessful. The default is 15.

Create a schema with the Schema Wizard

Here are the steps to create an SAP ERP schema with the Schema Wizard:

  • Sign in to the Incorta Direct Data Platform.
  • In the Navigation bar, select Schema.
  • In the Action bar, select + New → Schema Wizard.
  • In (1) Choose a Source, specify the following:
    • For Enter a name, enter the schema name.
    • For Select a Datasource, select the SAP ERP external data source.
    • Optionally create a description.
  • In the Schema Wizard footer, select Next.
  • In (2) Manage Tables, in the Data Panel, first select the name of the Data Source, and then check the Select All checkbox.
  • In the Schema Wizard footer, select Next.
  • In (3) Finalize, in the Schema Wizard footer, select Create Schema.

Create a schema with the Schema Designer

Here are the steps to create an SAP ERP schema using the Schema Designer:

  • Sign in to the Incorta Direct Data Platform.
  • In the Navigation bar, select Schema.
  • In the Action bar, select + New → Create Schema.
  • In Name, specify the schema name, and select Save.
  • In Start adding tables to your schema, select SAP.
  • In the Data Source dialog, specify the SAP ERP table data source properties.
  • Select Add.
  • In the Table Editor, in the Table Summary section, enter the table name.
  • To save your changes, select Done in the Action bar.

SAP ERP table data source properties

For a schema table in Incorta, you can define the following SAP ERP-specific data source properties:

Property | Control | Description
Type | drop down list | The default is SQL Database
Data Source | drop down list | Select the SAP ERP external data source
Schema | drop down list | Select the SAP schema
Table | drop down list | Select the SAP table
Derived Columns | text box | Enter the table derived columns
Selected Columns | text box | Enter the source table columns you would like to include. Use a comma separator between column names with no spaces. For example, MATNR,KOKRS,BELNR. Leave this field empty to include all the source table columns.
Extraction Filter | text box | Enter the filter to apply to the data extracted from the source table. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, COSP~GJAHR = 2021.
Full Load Setup | toggle | Enable this property to set up the full load details
Full Load Derived Columns | text box | Enable Full Load Setup to configure this property. Define constant value columns. This is useful when the number of full load columns is less than the number of incremental load columns.
Dependent Table | toggle | Enable Full Load Setup to configure this property. Enable this property to specify a dependent table.
Dependent Schema | drop down list | Enable Dependent Table to configure this property. Select the dependent schema.
Dependent Table | drop down list | Enable Dependent Table to configure this property. Select the dependent table.
Dependent Join Type | drop down list | Enable Dependent Table to configure this property. Select the join type between the source table and dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Join Condition | text box | Enable Dependent Table to configure this property. Enter the join condition between the source table and dependent table. The join condition requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, BKPF~BUKRS = BSEG~BUKRS.
Dependent Columns | text box | Enable Dependent Table to configure this property. Enter the names of the columns to include from the dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Dependent Table Extraction Filter | text box | Enable Dependent Table to configure this property. Enter the filter to apply to the data extracted from the dependent table. This filter is only considered when you join a cluster and/or pool table with a transparent table. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, CDHDR~OBJECTCLAS IN ('MAKT','MAKTC').
Dependent Table 2 | toggle | Enable Dependent Table to configure this property. Enable this property to specify a second dependent table.
Dependent Schema 2 | drop down list | Enable Dependent Table 2 to configure this property. Select the dependent schema for the second dependent table.
Dependent Table 2 | drop down list | Enable Dependent Table 2 to configure this property. Select the second dependent table.
Dependent Table 2 Join Type | drop down list | Enable Dependent Table 2 to configure this property. Select the join type between the source table and the second dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Dependent Table 2 Join Condition | text box | Enable Dependent Table 2 to configure this property. Enter the join condition between the source table and the second dependent table. The join condition requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, BKPF~BUKRS = BSEG~BUKRS.
Dependent Table 2 Columns | text box | Enable Dependent Table 2 to configure this property. Enter the names of the columns to include from the second dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Dependent Table 3 | toggle | Enable Dependent Table 2 to configure this property. Enable this property to specify a third dependent table.
Dependent Schema 3 | drop down list | Enable Dependent Table 3 to configure this property. Select the dependent schema for the third dependent table.
Dependent Table 3 | drop down list | Enable Dependent Table 3 to configure this property. Select the third dependent table.
Dependent Table 3 Join Type | drop down list | Enable Dependent Table 3 to configure this property. Select the join type between the source table and the third dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Dependent Table 3 Join Condition | text box | Enable Dependent Table 3 to configure this property. Enter the join condition between the source table and the third dependent table. The join condition requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, BKPF~BUKRS = BSEG~BUKRS.
Dependent Table 3 Columns | text box | Enable Dependent Table 3 to configure this property. Enter the names of the columns to include from the third dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Additional Full Load Extraction Filter | text box | Enter an additional filter to apply to the data extracted from the source table during a full load. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, COSP~WRTTP IN ('E','V').
Full Load Fetch Method | drop down list | Select the backend database fetch method to use when executing a full load query against the SAP Function Module. The options are OPEN_SQL (default) and OPEN_CURSOR.
Use Full Load Chunking | toggle | Enable this property to process the full data load in chunks
Chunking Column Type | drop down list | Enable Use Full Load Chunking to configure this property. Select the chunking column type. The options are Chunking by date or timestamp columns and Chunking by string or number columns.
Manage Full Load Column Selection (e.g. table~column1, table~column2) | text box | Enable Use Full Load Chunking to configure this property. Enter the specific columns to use for chunking.
Manage Full Load Start Date (format: yyyyMMdd, e.g. 20181225) | text box | Enable Use Full Load Chunking and select Chunking by date or timestamp columns as the Chunking Column Type to configure this property. Enter the full load start date.
Manage Full Load End Date (format: yyyyMMdd, e.g. 20181225) | text box | Enable Use Full Load Chunking and select Chunking by date or timestamp columns as the Chunking Column Type to configure this property. Enter the full load end date.
Manage Full Load Increment By | text box | Enable Use Full Load Chunking and select Chunking by date or timestamp columns as the Chunking Column Type to configure this property. Enter the full load increment quantity. The default is 30.
Manage Full Load Group Restriction | text box | Enable Use Full Load Chunking and select Chunking by string or number columns as the Chunking Column Type to configure this property. Enter the full load group restriction. The default is 100.
Full Load by Filter | text box | Enable Use Full Load Chunking and select Chunking by Year and Period columns as the Chunking Column Type to configure this property. Enter the full load static extraction filter. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. You can also use a global variable to get dynamic values for the filter condition. For example, COSP~GJAHR = 2021 AND (COSP~PERBL > @@GV_PERIOD OR COSP~BUDGET_PD > @@GV_PERIOD).
Incremental Load | toggle | Enable the incremental load configuration for the schema table
Incremental Load Type | drop down list | Enable Incremental Load to configure this property. Select the incremental load type. The options are Incremental Load by date or timestamp columns, Incremental Load by Static Filter, Chunking by string or number columns, and Chunking by Year and Period columns.
Incremental Load Derived Columns | text box | Enable Incremental Load Setup to configure this property. Define constant value columns. This is useful when the number of incremental load columns is less than the number of full load columns.
Incremental Dependent Table | toggle | Enable Incremental Load Setup to configure this property. Enable this property to specify an incremental dependent table.
Incremental Dependent Schema | drop down list | Enable Incremental Dependent Table to configure this property. Select the incremental dependent schema.
Incremental Dependent Table | drop down list | Enable Incremental Dependent Table to configure this property. Select the incremental dependent table.
Incremental Dependent Join Type | drop down list | Enable Incremental Dependent Table to configure this property. Select the join type between the source table and incremental dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Incremental Join Condition | text box | Enable Incremental Dependent Table to configure this property. Enter the join condition between the source table and incremental dependent table. The join condition requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, BKPF~BUKRS = BSEG~BUKRS.
Incremental Dependent Columns | text box | Enable Incremental Dependent Table to configure this property. Enter the names of the columns to include from the incremental dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any incremental dependent columns.
Incremental Dependent Table Extraction Filter | text box | Enable Incremental Dependent Table to configure this property. Enter the filter to apply to the data extracted from the incremental dependent table. This filter is only considered when you join a cluster and/or pool table with a transparent table. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, CDHDR~OBJECTCLAS IN ('MAKT','MAKTC').
Incremental Dependent Table 2 | toggle | Enable Incremental Dependent Table to configure this property. Enable this property to specify a second incremental dependent table.
Incremental Dependent Schema 2 | drop down list | Enable Incremental Dependent Table 2 to configure this property. Select the dependent schema for the second incremental dependent table.
Incremental Dependent Table 2 | drop down list | Enable Incremental Dependent Table 2 to configure this property. Select the second incremental dependent table.
Incremental Dependent Table 2 Join Type | drop down list | Enable Incremental Dependent Table 2 to configure this property. Select the join type between the source table and the second incremental dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Incremental Dependent Table 2 Join Condition | text box | Enable Incremental Dependent Table 2 to configure this property. Enter the join condition between the source table and the second incremental dependent table. The join condition requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, BKPF~BUKRS = BSEG~BUKRS.
Incremental Dependent Table 2 Columns | text box | Enable Incremental Dependent Table 2 to configure this property. Enter the names of the columns to include from the second incremental dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Incremental Dependent Table 3 | toggle | Enable Incremental Dependent Table 2 to configure this property. Enable this property to specify a third incremental dependent table.
Incremental Dependent Schema 3 | drop down list | Enable Incremental Dependent Table 3 to configure this property. Select the dependent schema for the third incremental dependent table.
Incremental Dependent Table 3 | drop down list | Enable Incremental Dependent Table 3 to configure this property. Select the third incremental dependent table.
Incremental Dependent Table 3 Join Type | drop down list | Enable Incremental Dependent Table 3 to configure this property. Select the join type between the source table and the third incremental dependent table. The options are INNER JOIN and LEFT OUTER JOIN.
Incremental Dependent Table 3 Join Condition | text box | Enable Incremental Dependent Table 3 to configure this property. Enter the join condition between the source table and the third incremental dependent table. The join condition requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, BKPF~BUKRS = BSEG~BUKRS.
Incremental Dependent Table 3 Columns | text box | Enable Incremental Dependent Table 3 to configure this property. Enter the names of the columns to include from the third incremental dependent table. Use a comma separator between column names with no spaces. For example, CPUDT,AEDAT,HWAE2. Leave this field empty if you do not want to include any dependent columns.
Additional Incremental Load Extraction Filter | text box | Enter an additional filter to apply to the data extracted from the source table during an incremental load. The extraction filter requires a column to be referenced as <TABLE_NAME>~<COLUMN_NAME>. For example, COSP~WRTTP IN ('E','V').
Incremental Load Fetch Method | drop down list | Select the backend database fetch method to use when executing an incremental load query against the SAP Function Module. The options are OPEN_SQL (default) and OPEN_CURSOR.
Incremental Extract Using | drop down list | Select Incremental Load by date or timestamp columns or Chunking by string or number columns as the Incremental Load Type to configure this property. Select one of the following options for the incremental load extract: Last Successful Extract Time (Incremental Load by date or timestamp columns only), Get Chunking Values Filter (Chunking by string or number columns only), or Maximum Value of a Column.
Use Change Management Document Tables for Delta Records | toggle | Select Incremental Load by date or timestamp columns as the Incremental Load Type to configure this property. Enable this property to use the change management document tables for the delta records.
Incremental Load Column Selection (e.g. table~column1, table~column2) | text box | Select Incremental Load by date or timestamp columns, Chunking by string or number columns, or Chunking by Year and Period columns as the Incremental Load Type to configure this property. Enter the specific columns to use for incremental loading.
Incremental Load Start Date (format: yyyyMMdd, e.g. 20181225) | text box | Select Incremental Load by date or timestamp columns or Chunking by Year and Period columns as the Incremental Load Type to configure this property. Enter the incremental load start date.
Incremental Load End Date (format: yyyyMMdd, e.g. 20181225) | text box | Select Incremental Load by date or timestamp columns or Chunking by Year and Period columns as the Incremental Load Type to configure this property. Enter the incremental load end date.
Incremental Load Increment By | text box | Select Incremental Load by date or timestamp columns as the Incremental Load Type to configure this property. Enter the incremental load increment quantity.
Incremental Load by Filter | text box | Select Incremental Load by Static Filter as the Incremental Load Type to configure this property. Enter the incremental load static filter.
Incremental Load Get Chunking Group Value Filter | text box | Select Chunking by string or number columns as the Incremental Load Type to configure this property. Enter the incremental load get chunking group value filter.
Manage Incremental Load Group Restriction | text box | Select Chunking by string or number columns as the Incremental Load Type to configure this property. Enter the incremental load group restriction. The default is 100.
Packet size in MB | text box | Enter the packet size in MB. The default is 10.
Use Parallel Process in SAP Function Module | toggle | This property is only applicable if Asynchronous was selected as the Extraction Method when the associated external data source was created. Enable this property to use parallel queries to load the data. Instead of using a single process for one SQL statement, parallel queries spread the work across multiple processes. This is useful where there is a lot of data, as in full table scans of large tables.
Use Parallel Processing For Chunking in Asynchronous Method | toggle | This property is only applicable if Asynchronous was selected as the Extraction Method when the associated external data source was created. Enable this property to load data chunks (batches) in parallel. If disabled, the system loads data chunks sequentially (the first chunk is loaded first, then the second, and so on).
Callback | toggle | Turn this option on to enable the Callback URL field
Callback URL | text box | This property appears when the Callback toggle is enabled. Specify the URL.
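
To make the date-chunking settings concrete, the following sketch (illustrative, not the connector's internal code) shows how a start date, end date, and increment partition a full load into range filters of the <TABLE_NAME>~<COLUMN_NAME> form used above:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;

public class DateChunkingSketch {
    private static final DateTimeFormatter F = DateTimeFormatter.ofPattern("yyyyMMdd");

    // Split [start, end] into windows of incrementDays, mirroring how the Manage
    // Full Load Start/End Date and Increment By settings partition a full load.
    static List<String> chunkFilters(String column, String start, String end, int incrementDays) {
        List<String> filters = new ArrayList<>();
        LocalDate from = LocalDate.parse(start, F);
        LocalDate last = LocalDate.parse(end, F);
        while (!from.isAfter(last)) {
            LocalDate to = from.plusDays(incrementDays - 1);
            if (to.isAfter(last)) to = last; // clamp the final window
            filters.add(column + " >= '" + F.format(from) + "' AND " + column + " <= '" + F.format(to) + "'");
            from = to.plusDays(1);
        }
        return filters;
    }

    public static void main(String[] args) {
        // 30-day chunks over Q1 2015 on BKPF~CPUDT (illustrative values)
        chunkFilters("BKPF~CPUDT", "20150101", "20150331", 30).forEach(System.out::println);
    }
}
```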

View the schema diagram with the Schema Diagram Viewer

Here are the steps to view the schema diagram using the Schema Diagram Viewer:

  • Sign in to the Incorta Direct Data Platform.
  • In the Navigation bar, select Schema.
  • In the list of schemas, select the SAP ERP schema.
  • In the Schema Designer, in the Action bar, select Diagram.

Load the schema

Here are the steps to perform a Full Load of the SAP ERP schema using the Schema Designer:

  • Sign in to the Incorta Direct Data Platform.
  • In the Navigation bar, select Schema.
  • In the list of schemas, select the SAP ERP schema.
  • In the Schema Designer, in the Action bar, select Load → Full Load.
  • To review the load status, in Last Load Status, select the date.

Explore the schema

With the full load of the SAP ERP schema complete, you can use the Analyzer to explore the schema, create your first insight, and save the insight to a new dashboard.

To open the Analyzer from the schema, follow these steps:

  • In the Navigation bar, select Schema.
  • In the Schema Manager, in the List view, select the SAP ERP schema.
  • In the Schema Designer, in the Action bar, select Explore Data.