References → Python Recipe
The Python recipe is an excellent way to inject PySpark into your dataflow. It can be used for simple or complex operations that are not currently available as built-in recipes.
Configuration
| Configuration | Description |
| --- | --- |
| Recipe Name | A freeform name for the recipe. |
| Input | Select one or more previously constructed recipes. The code tool can ingest multiple inputs. |
| Code | Enter the PySpark code to run within the recipe. |
Important PySpark Commands
Input Data
The inputs assigned in the Multiple Inputs setting populate a new configuration section called Input Variable Name. This section displays the actual variable name of each underlying DataFrame. Users can copy these variable names to reference the input DataFrames within their code.
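As a sketch of how the input variables are used, suppose the Input Variable Name section lists two inputs named `input_df_1` and `input_df_2` (hypothetical names; copy the real ones from your recipe's configuration). In a real recipe these are PySpark DataFrames; here they are stubbed as lists of dicts so the example runs without a Spark cluster.

```python
# Hypothetical input variable names, as shown in the Input Variable Name
# section. In the recipe these are PySpark DataFrames injected by the
# platform; they are stubbed here as plain lists of dicts.
input_df_1 = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
input_df_2 = [{"id": 2, "score": 9}]

# With real PySpark DataFrames, an inner join would be written as:
#   joined = input_df_1.join(input_df_2, on="id", how="inner")
# The stub below performs the equivalent join on the sample data.
joined = [dict(r1, **r2)
          for r1 in input_df_1
          for r2 in input_df_2
          if r1["id"] == r2["id"]]
# joined holds only the rows whose "id" appears in both inputs
```

The key point is that each copied variable name refers directly to one upstream recipe's output, so multiple inputs can be combined freely in the code.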
Output Data
For each Code recipe, the code must pass its output DataFrame to the output_df() function for the recipe to save properly.
Example: output_df(my_sample_df)
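A minimal sketch of a complete recipe body is shown below. The input DataFrame name (`sales_df`) is hypothetical, and both the input and the platform-provided output_df() are stubbed so the sketch runs without a Spark cluster; in a real recipe, `sales_df` would be a PySpark DataFrame and output_df() would be supplied by the platform.

```python
# Stub of the platform's output_df(): in the real recipe this function
# is provided for you and saves the DataFrame as the recipe's output.
saved = {}

def output_df(df):
    saved["result"] = df

# Stub input DataFrame (hypothetical name copied from the Input
# Variable Name section).
sales_df = [{"region": "east", "amount": 10},
            {"region": "west", "amount": 7},
            {"region": "north", "amount": 3}]

# The equivalent PySpark transformation would be:
#   my_sample_df = sales_df.filter(sales_df.amount > 5)
my_sample_df = [row for row in sales_df if row["amount"] > 5]

# The recipe must end by handing the result to output_df() so it can
# be saved and passed downstream.
output_df(my_sample_df)
```

Whatever transformations the code performs, the final DataFrame must reach output_df(); code that never calls it will not save a result.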