
      Spark Jobs

      Apache Spark can power a wide variety of data analysis jobs. In Fusion, Spark jobs are especially useful for generating recommendations.

      Spark job subtypes

      For the Spark job type, the available subtypes are listed below.

      • Aggregation

        Define an aggregation job.

      • Custom Python

        The Custom Python job lets users run Python code in Fusion. It supports Python 3.6 and later.

      • Script

        Run a custom Scala script as a Fusion job.
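As an illustration of the first two subtypes, the body of a Custom Python job is ordinary Python code. The sketch below is a self-contained stand-in, not Fusion's actual runtime contract: in a real job the script would read signals from and write results to Fusion collections, whereas here plain data structures are used. It shows the kind of per-document aggregation logic an aggregation job performs:

```python
# Stand-in for logic a Custom Python aggregation job might run.
# Hypothetical signal shape: {"type": ..., "doc_id": ..., "count": ...}.
from collections import Counter

def aggregate_clicks(signals):
    """Sum click signals per document ID."""
    counts = Counter()
    for signal in signals:
        if signal.get("type") == "click":
            counts[signal["doc_id"]] += signal.get("count", 1)
    return dict(counts)

signals = [
    {"type": "click", "doc_id": "doc1", "count": 2},
    {"type": "click", "doc_id": "doc2"},
    {"type": "query", "doc_id": "doc1"},  # non-click signals are ignored
    {"type": "click", "doc_id": "doc1"},
]
print(aggregate_clicks(signals))  # → {'doc1': 3, 'doc2': 1}
```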

      Additional Spark jobs are available with a Fusion license.

      Spark job configuration

      Spark jobs can be created and modified using the Fusion UI or the Spark Jobs API. They can be scheduled using the Fusion UI or the Jobs API.
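A job definition submitted to the Spark Jobs API is a JSON document. The sketch below is illustrative only: the endpoint path, job ID, subtype name, and field names are assumptions, not Fusion's documented schema. It shows the general shape of building a definition and posting it over HTTP:

```python
# Illustrative only: the endpoint path and field names below are
# assumptions, not the documented Fusion schema.
import json
import urllib.request

job_definition = {
    "id": "my-script-job",                   # hypothetical job ID
    "type": "script",                        # assumed subtype name
    "script": "println(spark.version)",      # trivial Scala body
}

def create_job(base_url, definition):
    """POST a job definition to an assumed Spark Jobs API endpoint."""
    request = urllib.request.Request(
        url=f"{base_url}/api/spark/jobs",    # assumed path
        data=json.dumps(definition).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(request)   # caller handles auth/errors

print(json.dumps(job_definition))
```

The actual parameter names for each subtype are listed in the Jobs Configuration Reference.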

      For the complete list of configuration parameters for all Spark job subtypes, see the Jobs Configuration Reference.