Import
The import API receives the ZIP file from the Export API and includes many options to add, replace, or update services in the destination environment. The API is designed not to proceed if a successful import cannot be completed.
Prerequisites to import Spark services
- Only 1 job can be executed at once per tenant.
- Any destination folders need to already exist and be accessible with the provided authorization.
- The uploaded package size must not exceed the configured limit (200 MB by default).
Import actions for Spark services
The services_modify and services_existing parameters can be used to control the destination and overwrite behaviors.
- The `services_modify` parameter's `service_uri_source` and `service_uri_destination` fields can be used to map each source service to a destination. This is useful if the folder and service names in production are different than in a lower environment.
- The `services_existing` parameter controls the behavior if the destination already exists.
- Where possible, the existing `version_id` of a service version will be migrated to the new environment (see Execute API (v3) for more information about the `version_id`).
| Destination state | `services_existing` | Import action |
| --- | --- | --- |
| Service does not currently exist | any value (does not impact import action) | A new service will be added with all of the service versions included in the source. |
| Service currently exists | not defined | The upload activity will fail. |
| Service currently exists | `abort` | The upload activity will fail. |
| Service currently exists | `replace` | The destination service and service versions will be replaced by the service in the source. Note that the previous API Call History for the service will no longer be accessible. |
| Service currently exists | `add_version` | The destination service will be updated with the service versions included from the source. The semantic versioning will follow the parameter provided in `update_version_type`. |
Authorization
Two authorization methods are supported:

- Bearer token: `Bearer {token}`, accessible from Authorization - Bearer token or systematically via Client Credentials. The request headers should include a key for `Authorization` with the value `Bearer {token}`.
- API key: created from Authorization - API keys. The API key groups must contain User groups that are also assigned the Permissions - Features permissions `Spark.Imports.json` or `Spark.AllEncompassingProxy.json`. The request headers should include the keys `x-synthetic-key` and `x-tenant-name` with the values of the API key and tenant name respectively.
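The two header shapes above can be sketched as small helpers (the token, API key, and tenant values below are placeholders):

```python
# Sketch of the two authorization header shapes described above.
# Token, API key, and tenant values are placeholders, not real credentials.

def bearer_headers(token: str) -> dict:
    """Headers for Bearer-token authorization."""
    return {"Authorization": f"Bearer {token}"}

def api_key_headers(api_key: str, tenant: str) -> dict:
    """Headers for API-key authorization."""
    return {"x-synthetic-key": api_key, "x-tenant-name": tenant}

print(bearer_headers("eyJhb..."))
print(api_key_headers("my-api-key", "mytenant"))
```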
POST import job
Returns: Response from GET import status.
Path parameters
tenant *
Tenant is part of your Log in to Spark URL and also available in the User menu.
Request body
Content-Type: multipart/form-data
file *
File to import from the previous step.
importRequestEntity
Stringified JSON content of the importRequestEntity parameters. If not provided, sensible defaults will be attempted for the import.
importRequestEntity parameters
inputs.services_modify
Array of services modify objects. If not provided, it is assumed that the destination is taken from the source folder and service name.
The primary use of this parameter is to modify the destination of the exported service. Settings will apply only to the noted services.
JSON Array of the imported service settings:
- `service_uri_source` in C.SPARK_XCALL() format of `{folder}/{service}` to reference the service provided in the source.
- `service_uri_destination` in C.SPARK_XCALL() format of `{folder}/{service}` to change the destination `folder` and `service`. If omitted, it is assumed to be the same as `service_uri_source`.
- `update_version_type`: `major`, `minor`, or `patch`; if not provided, `minor` is assumed as the default. This only applies when `services_existing` is `add_version`. Alternatively, if `copy` is provided, then Spark will try to use the exact same semantic versions as the source.
See the sample request for an example.
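The defaulting rules above can be sketched as a small helper (the helper name is illustrative; only the defaults stated here are applied):

```python
def resolve_modify_entry(entry: dict) -> dict:
    """Apply the documented defaults to one services_modify object:
    - service_uri_destination falls back to service_uri_source
    - update_version_type falls back to "minor"
    """
    resolved = dict(entry)
    resolved.setdefault("service_uri_destination", entry["service_uri_source"])
    resolved.setdefault("update_version_type", "minor")
    return resolved

print(resolve_modify_entry({"service_uri_source": "myproduct/myservice"}))
```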
services_existing
Optional parameter that defines how to process the import if a service with the same name already exists in the destination folder and is accessible.
- `abort` (default): the import will fail if a service already exists in the specified destination folder with the specified service name.
- `replace`: the entire service will be removed and replaced with the service versions in the imported file. Note that the previous API Call History for the service will no longer be accessible. If there are existing overlapping semantic versions between the destination and source, the existing `version_id` will be maintained.
- `add_version`: any existing services will be updated with the service versions included in the file.
source_system
Tag API call.
Example: mycicd
correlation_id
Tag API call.
Example: 456
Sample request
- The exported ZIP contains a service taken from the folder `myproduct` and service `myservice`.
- The destination for this service is the folder `newproduct` and service `newservice`.
- The service version semantic versions will be incremented with a `major` revision number, e.g. from `1.3.1` to `2.0.0`.
importRequestEntity
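Assembled from the parameter descriptions above, the `importRequestEntity` for this scenario would look roughly like the following (a sketch: the placement of `services_existing` and the tag fields follows the parameter list above and should be confirmed against a live request):

```python
import json

# Sketch of an importRequestEntity for the scenario described above.
# Field nesting is assumed from the "inputs.services_modify" path and the
# parameter list documented earlier; verify against a working request.
entity = {
    "inputs": {
        "services_modify": [
            {
                "service_uri_source": "myproduct/myservice",
                "service_uri_destination": "newproduct/newservice",
                "update_version_type": "major",
            }
        ],
    },
    "services_existing": "add_version",
    "source_system": "mycicd",
    "correlation_id": "456",
}
print(json.dumps(entity, indent=2))
```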
Response
HTTP 200 OK Content-Type: application/json
Returns the response from GET import status.
GET import status
Returns: status response.
Path parameters
tenant *
Tenant is part of your Log in to Spark URL and also available in the User menu.
jobId *
id from POST import job.
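Since POST import job returns this same status response, a client typically polls this endpoint until the job reaches a terminal state. A sketch with an injected `fetch_status` callable and a caller-supplied `is_done` predicate (the exact status values are not specified here, so terminality is left to the caller):

```python
import time

def poll_import_status(fetch_status, is_done, interval: float = 2.0,
                       max_attempts: int = 30) -> dict:
    """Poll a status endpoint until `is_done(response)` is true.

    `fetch_status` performs the GET import status call (HTTP client not
    shown here); `is_done` decides when the job is finished, since the
    concrete status values should be checked against real responses.
    """
    for _ in range(max_attempts):
        response = fetch_status()
        if is_done(response):
            return response
        time.sleep(interval)
    raise TimeoutError("import job did not finish in time")
```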
Response
Similar to the Export API Request body, with the following differences:
outputs.services
Array of the imported services sorted by folder_source, service_source:
- `service_uri_source` in C.SPARK_XCALL() format of `{folder}/{service}`
- `folder_source`
- `service_source`
- `service_id_source`
- `service_uri_destination` in C.SPARK_XCALL() format of `{folder}/{service}`
- `folder_destination`
- `service_destination`
- `service_id_destination`
- `status`:
  - `added`: service did not exist and it was added.
  - `replaced`: service existed and was replaced.
  - `updated`: service existed and new service versions were added.
outputs.service_versions
Array of the imported service versions sorted by folder_source, service_source, version_source:
- `service_uri_source` in C.SPARK_XCALL() format of `{folder}/{service}[{version}]`
- `folder_source`
- `service_source`
- `version_source` (semantic version)
- `service_id_source` (not possible to get this value)
- `version_id_source`
- `service_uri_destination` in C.SPARK_XCALL() format of `{folder}/{service}[{version}]`
- `folder_destination`
- `service_destination`
- `version_destination`
- `service_id_destination`
- `version_id_destination`
- `status`:
  - `added`: service version was added.
  - `replaced`: service version existed and was replaced.
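The per-version `status` values make it easy to verify an import outcome in a CI/CD step, for example by tallying results (the field names follow the response description above; the sample data is made up):

```python
from collections import Counter

def summarize_versions(service_versions: list) -> Counter:
    """Count imported service versions by their reported status."""
    return Counter(v["status"] for v in service_versions)

# Hypothetical service_versions entries, trimmed to the fields used here.
sample = [
    {"version_source": "1.3.1", "status": "added"},
    {"version_source": "1.3.0", "status": "replaced"},
    {"version_source": "1.2.0", "status": "added"},
]
print(summarize_versions(sample))
```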
Sample response
HTTP 200 OK Content-Type: application/json
PATCH import
Change import job status. This can cancel an errant job.
Returns: Response from GET import status.
Path parameters
environment *
Environment is part of your Log in to Spark URL.
tenant *
Tenant is part of your Log in to Spark URL and also available in the User menu.
jobId *
id from POST import job.
Request
Content-Type: application/json
Response
HTTP 200 OK Content-Type: application/json
Returns the response from GET import status.
GET import status across the tenant
Get information about import jobs that are in progress (`in_progress_imports`) or recent (`recent_imports`) within the past hour.
Returns: Import jobs.
- If you are a `supervisor:pf` user, you will be able to see all imports run by users within your tenant.
- Otherwise, you will only see information about the batches that you initiated yourself.
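A sketch of working with this response shape: the two buckets named above can be flattened into one job list for display or filtering (the bucket keys come from the description above; any per-job fields such as `id` are assumptions):

```python
def flatten_tenant_imports(status_response: dict) -> list:
    """Combine in-progress and recent import jobs from the tenant-wide
    status response into one list, tagging each job with its bucket.

    Bucket keys follow the field names mentioned above; per-job fields
    (e.g. "id") are illustrative assumptions.
    """
    jobs = []
    for bucket in ("in_progress_imports", "recent_imports"):
        for job in status_response.get(bucket, []):
            jobs.append({**job, "bucket": bucket})
    return jobs

sample = {"in_progress_imports": [{"id": "j1"}], "recent_imports": [{"id": "j2"}]}
print(flatten_tenant_imports(sample))
```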
Path parameters
tenant *
Tenant is part of your Log in to Spark URL and also available in the User menu.
Sample response
HTTP 200 OK Content-Type: application/json