Spark Jobs
Apache Spark can power a wide variety of data analysis jobs. In Fusion, Spark jobs are especially useful for generating recommendations.
Spark job subtypes
For the Spark job type, the available subtypes are:
- Define an aggregation job.
- Run a custom Spark job.
- Run a custom Scala script as a Fusion job.
Additional Spark jobs are available with a Fusion AI license.
Spark job configuration
Spark jobs can be created and modified in the Fusion UI or with the Spark Jobs API, and scheduled in the Fusion UI or with the Jobs API.
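As a rough illustration of creating a job programmatically, the sketch below builds a minimal job configuration and POSTs it to a Fusion endpoint. The endpoint path, the `type` value, the configuration fields, and the host are assumptions for illustration; consult the Jobs Configuration Reference for the actual schema.

```python
import json
import urllib.request

# Hypothetical Fusion host; replace with your deployment's address.
FUSION_HOST = "https://localhost:8764"


def aggregation_job_config(job_id: str, input_collection: str) -> dict:
    """Build a minimal aggregation-job payload.

    The field names here are illustrative assumptions, not the
    definitive Fusion configuration schema.
    """
    return {
        "id": job_id,
        "type": "aggregation",              # assumed subtype identifier
        "inputCollection": input_collection,
    }


def create_spark_job(config: dict) -> int:
    """POST the configuration to the (assumed) Spark Jobs endpoint
    and return the HTTP status code. Authentication and error
    handling are omitted for brevity."""
    req = urllib.request.Request(
        f"{FUSION_HOST}/api/spark/configurations",  # assumed path
        data=json.dumps(config).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

A job created this way still appears in the Fusion UI, where it can be inspected, edited, or scheduled like any UI-created job.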
For the complete list of configuration parameters for all Spark job subtypes, see the Jobs Configuration Reference.