References → SQL Recipe

The SQL recipe is an excellent way to inject Spark SQL into your dataflow. Use it for simple or complex operations that are not currently available as a built-in recipe.

Configuration | Description
Recipe Name | A freeform name for the recipe
Multiple Inputs | One or more previously constructed recipes. The SQL tool can ingest multiple inputs.
Code | The Spark SQL code to run within the recipe
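As a sketch, the Code field accepts an ordinary Spark SQL statement. The table and column names below are hypothetical placeholders standing in for an input variable name and its columns:

```sql
-- Hypothetical example: aggregate an input data frame by region.
-- sales_input stands in for a generated input variable name.
SELECT region,
       SUM(amount) AS total_amount
FROM sales_input
GROUP BY region
```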

Or, use Incorta Nexus to generate an intelligent SQL query suggestion:
  ●   Enter the description or desired goal of the SQL query.
  ●   Select the Incorta Nexus icon next to the description field.
  ●   View the generated SQL query based on your input in the description field.

Important PySpark commands

Reference Input Sources

The inputs assigned in the Multiple Inputs setting populate a new configuration section called Input Variable Name, which displays the actual name of each underlying data frame. Copy these variable names and use them as table names in your SQL. Note that no quotation marks are needed when specifying tables or columns.
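For instance, assuming two inputs whose generated variable names are (hypothetically) orders_df and customers_df, a join across them could look like the sketch below. Neither the table names nor the column names are quoted:

```sql
-- Hypothetical join of two input data frames by their
-- Input Variable Names; names are placeholders.
SELECT o.order_id,
       c.customer_name,
       o.amount
FROM orders_df o
JOIN customers_df c
  ON o.customer_id = c.customer_id
```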

Output Data

Each Code recipe must produce its output as a data frame for the recipe to save properly. Use the output_df() function to save the data frame.

Example: output_df(my_sample_df)