Export

The export API packages all of the files necessary to create a service into a ZIP file. The ZIP file can then be used by the import API to deploy the exported services to another tenant and/or environment. The API is designed not to proceed unless the export can complete successfully.

Prerequisites to export Spark services

  • Only one export job can run at a time per tenant.

  • All referenced objects must exist and be accessible with the provided authorization.

  • Any referenced folder must contain at least one service.

  • All referenced services must have a compiled Neuron WebAssembly module.

  • The requested package size must not exceed the configured limit (200 MB by default).

Authorization

These APIs support authorization via:

POST export job

Returns: Response from GET export status.

Path parameters

Key
Value

tenant *

Tenant is part of your Log in to Spark URL and also available in the User menu.

Request body

Content-Type: application/json

Note that the union of all input service identifiers (folders, services, and version IDs) will be included in the exported package.

Key
Value

inputs.folders

Array of folder names.

inputs.services

Array of service identifiers in C.SPARK_XCALL() format: {folder}/{service}, {folder}/{service}[{version}], or a service_id.

If a version is specified, a starts-with match selects the latest matching semantic version; e.g., 3.2 selects the latest semantic version that starts with 3.2.

  • Example 1: myfolder/myservice

  • Example 2: myfolder/myservice[1.4.3]

  • Example 3: 5edf95a1-96f3-4a53-b9a4-9ff382bd9936

  • Example 4: ["myfolder/myservice1", "yourfolder/service2"]

inputs.version_ids

Array of version_ids.

file_filter

Filter the requested files.

  • The default value is migrate, which exports all of the files relevant to the service.

  • onpremises includes only the files that are needed for the Hybrid Runner.

source_system

Tag for the API call.

Example: mycicd

correlation_id

Tag for the API call.

Example: 456

version_filter

Filter service versions.

  • The default value is all, which exports all of the service versions for the identified folders and services.

  • latest exports only the latest service version for each service. If latest is used and the inputs reference a specific service version, the export will generate an error.

file_name

Name of the downloaded file. If not provided, Spark will use an appropriate name.
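The starts-with version matching described for inputs.services can be sketched as follows. This is an illustrative reimplementation, not the Spark API itself, and match_version is a hypothetical helper name.

```python
# Sketch of the starts-with matching used when a version like "3.2" is
# requested: take the latest semantic version that begins with that prefix.
# Hypothetical helper for illustration only; not part of the Spark API.
def match_version(requested: str, available: list[str]) -> str:
    """Return the highest semantic version whose text starts with `requested`."""
    candidates = [
        v for v in available
        if v == requested or v.startswith(requested + ".")
    ]
    if not candidates:
        raise ValueError(f"no version matches {requested!r}")
    # Compare numerically on major.minor.patch so 3.2.10 outranks 3.2.9.
    return max(candidates, key=lambda v: tuple(int(p) for p in v.split(".")))

print(match_version("3.2", ["3.2.1", "3.2.10", "3.3.0", "3.2.9"]))  # → 3.2.10
```

Note the numeric comparison: a plain string sort would incorrectly rank 3.2.9 above 3.2.10.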

Sample request
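A request body assembled from the parameters above might look like the following; all values are illustrative, and the exact shape should be confirmed against your tenant.

```json
{
  "inputs": {
    "services": ["myfolder/myservice", "myfolder/myservice[1.4.3]"]
  },
  "file_filter": "migrate",
  "version_filter": "all",
  "source_system": "mycicd",
  "correlation_id": "456",
  "file_name": "mypackage"
}
```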

Response

HTTP 200 OK Content-Type: application/json

Returns the response from GET export status.

GET export status

Returns: status response and link to download the ZIP file.

Path parameters

Key
Value

tenant *

Tenant is part of your Log in to Spark URL and also available in the User menu.

jobId *

ID of the export job, returned by POST export job.

Response

Content-Type: application/json

Key
Value

object

export

id

id for the job.

response_timestamp

Response timestamp.

status

  • created Job registered on Spark.

  • in_progress Job in progress.

  • closed Job completed successfully.

  • closed_by_timeout Job was unable to complete within 15 minutes.

  • failed Job was not able to complete.

status_url

Link to the GET status API.

process_time

Time taken for the job.

outputs.files

Array of the output files:

  • file Link to the package. Filename of the exported object is package.zip. This can be overridden with the file_name parameter.

  • file_hash SHA-256 of the packaged file.

outputs.services

Array of the exported services sorted by folder, service:

  • service_uri_source in C.SPARK_XCALL() format of {folder}/{service}

  • folder_source

  • service_source

  • service_id_source

outputs.service_versions

Array of the exported service versions sorted by folder_source, service_source, version_source:

  • service_uri_source in C.SPARK_XCALL() format of {folder}/{service}[{version}]

  • folder_source

  • service_source

  • version_source (semantic version)

  • service_id_source

  • version_id_source

source_system

Value from POST export job.

correlation_id

Value from POST export job.

Sample response

HTTP 200 OK Content-Type: application/json
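Putting the status values and outputs.files fields above together, a client would typically poll GET export status until a terminal status is reached, then check the downloaded package against file_hash. The sketch below assumes only the documented field names (status, file_hash); the fetch function is injected so the loop can be exercised without a live tenant.

```python
import hashlib
import time
from typing import Callable

# Terminal statuses documented for the export job.
TERMINAL = {"closed", "closed_by_timeout", "failed"}

def wait_for_export(fetch_status: Callable[[], dict],
                    poll_seconds: float = 2.0,
                    max_polls: int = 450) -> dict:
    """Poll the GET export status endpoint until the job reaches a
    terminal status, returning the final response body."""
    for _ in range(max_polls):
        response = fetch_status()
        if response["status"] in TERMINAL:
            return response
        time.sleep(poll_seconds)
    raise TimeoutError("export job did not reach a terminal status")

def verify_package(data: bytes, expected_hash: str) -> bool:
    """Compare the downloaded package bytes against the reported
    file_hash (SHA-256, hex-encoded)."""
    return hashlib.sha256(data).hexdigest() == expected_hash.lower()
```

Note that closed is the only success status; closed_by_timeout and failed are also terminal and should be handled before attempting the download.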

PATCH export

Change export job status. This can cancel an errant job.

Returns: Response from GET export status.

Path parameters

Key
Value

environment *

Environment is part of your Log in to Spark URL.

tenant *

Tenant is part of your Log in to Spark URL and also available in the User menu.

jobId *

ID of the export job, returned by POST export job.

Request

Content-Type: application/json
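A body for cancelling an errant job might look like the following sketch; the field name and value here are assumptions, so confirm the exact shape against your tenant's API reference.

```json
{
  "status": "cancelled"
}
```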

Response

HTTP 200 OK Content-Type: application/json

Returns the response from GET export status.

GET export status across the tenant

Get information about export jobs that are in progress (in_progress_exports) or that ran within the past hour (recent_exports).

Returns: Export jobs.

  • If you are a supervisor:pf user, you will be able to see all exports run by users within your tenant.

  • Otherwise, you will only see information about the export jobs that you initiated yourself.

Path parameters

Key
Value

environment *

Environment is part of your Log in to Spark URL.

tenant *

Tenant is part of your Log in to Spark URL and also available in the User menu.

Sample response

HTTP 200 OK Content-Type: application/json
