Objects API

The Objects API lets you migrate objects between Fusion instances by exporting and importing. Fusion objects include all your searchable data, plus pipelines, aggregations, and other configurations on which your collections depend.

You can select all objects, or limit the operation to specific object types or IDs. In addition to export/import endpoints, a validation endpoint is provided for troubleshooting.

This service was introduced in Fusion 3.0.

By default, system-created collections are not exported. See below for details.

Object export and import

Collections and encrypted values are treated specially; details are provided below. During import, conflicts are resolved according to the specified import policy.

For objects other than collections, no implicit filtering is performed; all objects are included by default. However, on export you can filter by type and ID.

Supported objects

Fusion lets you export and import these types of objects:

  • collection

  • index-pipeline

  • query-pipeline

  • search-cluster

  • datasource

  • banana

  • parser

  • group

  • link

  • task

  • job

  • spark

Exporting and importing collections

Collections are processed with these dependent objects:

  • features

  • index profiles

  • query profiles

Datasources, parser configurations, and pipeline configurations are not included when collections are exported or imported. These must be exported and imported explicitly.

Only user-created collections are included by default. Certain types of collections are excluded:

  • the "default" collection

  • collections whose type is not DATA

  • collections whose names start with "system_"

  • "Secondary" collections, that is, collections created by features

    Instead, create the same features on the target system; this automatically creates the corresponding secondary collections.

You can override these exclusions by explicitly specifying a collection in the export request.
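As a sketch (the collection name here is hypothetical), naming an otherwise-excluded collection with the collection.ids parameter, described under "Filtering on export", includes it in the export:

```shell
# Export an otherwise-excluded collection by naming it explicitly.
# "system_logs" is a placeholder; substitute your collection's ID.
curl -u user:pass "http://localhost:8764/api/apollo/objects/export?collection.ids=system_logs"
```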


Encrypted passwords

Some objects, such as datasources and pipelines, include encrypted passwords for accessing remote data.

  • On export, these encrypted values are replaced with ${secret.n.nameOfProperty}.

  • On import, the original, plaintext passwords must be provided in a JSON map:

    {"secret.1.bindPassword" : "abc", "secret.2.bindPassword" : "def"}

    The file must be supplied as multipart form data.

Variables that do not start with secret. are ignored.
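To illustrate (the object ID and property names are hypothetical), an exported datasource fragment might carry a variable where the encrypted password used to be; the JSON map supplied on import maps that variable name back to the plaintext value:

```json
{
  "id": "my_ldap_datasource",
  "properties": {
    "bindPassword": "${secret.1.bindPassword}"
  }
}
```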

Import policies

On import, the importPolicy parameter is required. It specifies what to do if any object in the import list already exists on the target system:

  • abort

    If there are conflicts, then import nothing.

  • merge

    If there are conflicts, then skip the conflicting objects.

  • overwrite

    If there are conflicts, then overwrite or delete/create the conflicting objects on the target system.
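For instance (the file path and credentials are placeholders), an import that overwrites any conflicting objects on the target could look like this:

```shell
# Overwrite (or delete and re-create) any conflicting objects on the target.
curl -u user:pass -H "Content-Type:multipart/form-data" -X POST \
  -F 'importData=@/Users/admin/Fusion/export.json' \
  "http://localhost:8764/api/apollo/objects/import?importPolicy=overwrite"
```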

Filtering on export

On export, there are two ways to specify the objects to include:

  • by type

    You can specify a list of object types to export all objects of those types. Valid values:

    • collection

    • index-pipeline

    • query-pipeline

    • search-cluster

    • datasource

    • banana

    • parser

    • group

    • link

    • task

    • job

    • spark

  • by type and ID

    The type.ids parameter lets you list the IDs to match for the specified object type.

The type and type.ids parameters can be combined as needed.

Exporting linked objects

Related Fusion objects are linked. You can view linked objects using the Links API or the Object Explorer.

When exporting a specific Fusion object, you can also export its linked objects without specifying each one individually. To export all objects linked to the specified object, include the deep="true" query parameter in your request. See the example below. When deep is "true", Fusion follows these link types:

  • DependsOn

  • HasPart

  • RelatesTo


Object validation

Objects are validated before import. If any objects fail validation, the whole import request is rejected. A separate endpoint is available for validating objects without importing them.

Validation includes checking whether an object already exists on the target system and whether the user is authorized to create or modify the object.
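To check a file without importing anything, you can post it to the validation endpoint. A sketch, assuming the endpoint path mirrors the import endpoint (the path and file location are assumptions):

```shell
# Validate the export file without importing anything.
curl -u user:pass -H "Content-Type:multipart/form-data" -X POST \
  -F 'importData=@/Users/admin/Fusion/export.json' \
  "http://localhost:8764/api/apollo/objects/validate"
```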

For collection objects, the following special validation is performed:

  • We check the searchClusterId of each collection and verify that a cluster with this ID exists on the target system or in the import file (error).

  • We check that features, index profiles, and query profiles belong only to the collections specified in the import file (error).

  • We check that a feature exists on the target system for each feature in the import file (error).

  • We check for index profiles or query profiles that do not exist on the target system or in the import file (warning).

Job objects contain schedule configurations. Fusion imports a job only if its associated task, datasource, or spark object is also present, either on the target host or in the import file.

Status messages

Validation completed with no errors

The validation method was called and no errors were found, though there may be warnings.

Validation found errors

The validation method was called and errors were found. Validation does not stop at the first error, so the complete list of errors is reported.

Validation was not completed because of system error

The validation was interrupted by a system error.

Import was not performed because validation errors exist

The import method was called, but the import did not start because of validation errors.

Import was not performed because of input data error

The import method was called, but the import did not start because Fusion could not find a substitution for one of the secret values in the imported objects.

Import was not completed because of system error

The validation found no errors and the import started, but it was interrupted by a system error.

Import was completed

Validation found no errors and the import finished successfully.


Examples

Export all objects
curl -u user:pass http://localhost:8764/api/apollo/objects/export
Export all datasources
curl -u user:pass http://localhost:8764/api/apollo/objects/export?type=datasource
Export a specific datasource and its linked objects
curl -u user:pass "http://localhost:8764/api/apollo/objects/export?datasource.ids=movies_csv-ml-movies&deep=true"
Export all datasources and pipelines, plus two specific parsing configurations
curl -u user:pass "http://localhost:8764/api/apollo/objects/export?type=datasource,index-pipeline,query-pipeline&parser.ids=cinema_parser,metafiles_parser"
Import objects from a file and stop if there are conflicts
curl -u user:pass -H "Content-Type:multipart/form-data" -X POST -F 'importData=@/Users/admin/Fusion/export.json' http://localhost:8764/api/apollo/objects/import?importPolicy=abort
Import objects, substitute the password variables, and merge any conflicts
curl -u user:pass -H "Content-Type:multipart/form-data" -X POST -F 'importData=@/Users/admin/Fusion/export.json' -F 'variableValues=@password_file.json' http://localhost:8764/api/apollo/objects/import?importPolicy=merge
password_file.json must contain plaintext passwords.