Adapter playbook

The Adapter playbook provides guidance on how to integrate Spark with different AS400 applications. The diagram below shows the Spark Adapter solution architecture.

General requirements

Local environment

Java

Maven

Docker container environment

Docker

Docker Compose

Adapter setup guides

Adapter DML LifeAsia

Adapter DML LifeAsia's API accepts a POST request containing a sparkModel and an Excel file. Based on the file provided, together with the templates and copybooks, it creates a DML result that is returned in the API response in string format.
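As a sketch, such a request might look like the following; the host, port, endpoint path, and form field names are assumptions, so check them against your deployment:

```shell
# Hypothetical endpoint; adjust host, port, and path to your deployment.
ADAPTER_URL="http://localhost:8080/api/v1/dml"

# sparkModel and the Excel file are sent as multipart form fields
# (the field names here are illustrative).
curl -X POST "$ADAPTER_URL" \
  -F "sparkModel=@sparkModel.json" \
  -F "file=@input.xlsx"
```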

Here are some links and references for the Adapter DML LifeAsia.

Populating the resources folder

Paste the necessary copybooks and templates into the resources folder.

Local environment setup

Install local jar dependencies

Build

Run

or via

Test
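The build, run, and test steps above are typically Maven commands; a sketch, assuming a Spring Boot-style Maven project (the jar name is illustrative — check your pom.xml for the real artifact name):

```shell
# Illustrative jar name; check the build output under target/ for the real one.
JAR=target/adapter-dml-lifeasia.jar

mvn clean package     # Build
java -jar "$JAR"      # Run
mvn spring-boot:run   # ...or run via the Maven plugin
mvn test              # Test
```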

Docker environment setup

Build

Please reference Docker's documentation on build.

Run

Unix

Windows

Logs

Please reference Docker's documentation on run and logs.
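The Docker build, run, and logs steps might look like the following; the image name and published port are assumptions:

```shell
# Image/container name and port mapping are illustrative.
IMAGE=adapter-dml-lifeasia

docker build -t "$IMAGE" .                          # Build
docker run -d -p 8080:8080 --name "$IMAGE" "$IMAGE" # Run
docker logs -f "$IMAGE"                             # Logs
```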


Adapter DML Ingenium

Adapter DML Ingenium's API accepts a POST request containing a sparkModel and an Excel file. Based on the file provided, it creates a DML result that is returned in the API response in JSON format.
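A sketch of such a request; the endpoint and form field names are assumptions. Since the response is JSON, it can be piped through a pretty-printer for inspection:

```shell
# Hypothetical endpoint; adjust host, port, and path to your deployment.
ADAPTER_URL="http://localhost:8080/api/v1/dml"

# The Ingenium adapter answers with JSON, so pretty-print the response body.
curl -sX POST "$ADAPTER_URL" \
  -F "sparkModel=@sparkModel.json" \
  -F "file=@input.xlsx" | python3 -m json.tool
```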

Here are some links and references for the Adapter DML Ingenium:

Local environment setup

Build

Run

or via

Test

Docker environment setup

Build

Please reference Docker's documentation on build.

Run

Logs

Please reference Docker's documentation on run and logs.


Adapter Externaliser

The Externaliser is an adapter that exposes Spark calculations to an AS400 server: it listens for requests on a message queue and writes the calculation result back to the response queue.

Here are some links and references for the Adapter Externaliser:

Environment configurations

Provide the proper values and credentials in the environment (.ini) file, including the following:

  • SERVER - AS400 System application server name.

  • AS400_USER - AS400 System application user name.

  • AS400_PASSWORD - AS400 System application password.

  • REQUESTQUEUE - Request data queue.

  • RESPONSEQUEUE - Response data queue.

  • LIBRARY - System library name.

  • INPUTSIZE - Maximum input size.

  • JADELEGACY - Service API Link.

  • PROXY_IP - Proxy's IP address, used for connection to call Spark.

  • PROXY_UID - Proxy's identification, used for connection to call Spark.

  • PROXY_PWD - Proxy's password, used for connection to call Spark.

  • SYNTHETIC_KEY - Spark Authentication Key.

  • TENANT - Spark Tenant Name.

  • SPARK_URL - Spark excel engine service url.

  • TARGET_SERVICE_OUTPUT - Key name mapping of the result.

  • SENTRY_DSN - Sentry data source name.

  • SENTRY_DEBUG - Enable sentry debug logs.
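Put together, the .ini file might look like the sketch below; every value is a placeholder to be replaced with your own settings:

```ini
SERVER=AS400HOST
AS400_USER=MYUSER
AS400_PASSWORD=changeme
REQUESTQUEUE=REQQ
RESPONSEQUEUE=RESQ
LIBRARY=MYLIB
INPUTSIZE=1024
JADELEGACY=http://legacy.example.com/api
PROXY_IP=10.0.0.1
PROXY_UID=proxyuser
PROXY_PWD=changeme
SYNTHETIC_KEY=<your Spark API key>
TENANT=mytenant
SPARK_URL=https://spark.example.com/excel
TARGET_SERVICE_OUTPUT=result
SENTRY_DSN=https://publickey@example.ingest.sentry.io/0
SENTRY_DEBUG=false
```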

Create an API key and paste it into the SYNTHETIC_KEY environment variable in the .ini file in /bin. Please reference Coherent's documentation on Authorization - API keys.

Local environment setup

Build

Run

Run the build output (make sure to have the .ini file in the bin folder).

Unix

Windows
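The run step might look like the following sketch; the jar name and folder layout are assumptions, and the same java command works from a Unix shell or a Windows prompt:

```shell
# Hypothetical layout: the .ini file must sit in the bin folder
# next to the build output.
BIN_DIR=bin
cd "$BIN_DIR" && java -jar externaliser.jar
```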

Test

In AS400, access your AS400 server and enter your credentials.

Enter your library.

Enter the necessary inputs such as service name, key, tag names and values. Then press enter to continue.

Logs from the request will appear in the Externaliser. Wait for the response from the Externaliser to display on the AS400.

Docker Environment Setup

Build

Please reference Docker's documentation on build.

Run

Logs

Please reference Docker's documentation on run and logs.


Adapter Transport

The Adapter Transport service API accepts POST requests with a JSON body and creates a file containing the DML result.

Here are some links and references for the Adapter Transport:

Environment configurations

A sample configuration file is located in src/main/resources/application.yml. It is a standard Spring Boot configuration.

HTTP configuration

A minimum configuration for the HTTP server only needs to set the listening port.

Spark configuration

  • host: location of your Spark installation.

  • tenant: name of your tenant.

  • api.key: Spark API key.

  • context: path of your webhook, defaults to /post.

Adapter configuration

  • url: location of the adapter.
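Assembled from the settings above, a minimal application.yml might look like the sketch below; the values are placeholders, and the exact key nesting beyond the documented names is an assumption — compare against the sample file in src/main/resources:

```yaml
server:
  port: 8080                  # HTTP server port (assumed value)

spark:
  host: https://spark.example.com   # location of your Spark installation
  tenant: mytenant                  # name of your tenant
  api:
    key: <your Spark API key>       # Spark API key
  context: /post                    # webhook path, defaults to /post

adapter:
  url: http://localhost:9090        # location of the adapter
```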

Local environment setup

Build

Run

or via

Test

A successful webhook/API request outputs testoutput.35e15e92-bcbe-4dc1-ac79-3c19f309d67a.dml in the root app folder of the Docker container.
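A test request might be sent as follows; the host, port, and JSON body fields are assumptions (the path matches the default /post context described above):

```shell
# Hypothetical webhook endpoint; the path follows the default context /post.
WEBHOOK_URL="http://localhost:8080/post"

# The JSON body fields here are illustrative; see your service contract.
curl -X POST "$WEBHOOK_URL" \
  -H "Content-Type: application/json" \
  -d '{"sparkModel": "testoutput", "data": {}}'
```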

Docker environment setup

Build

Please reference Docker's documentation on build.

Run

Logs

Please reference Docker's documentation on run and logs.


Docker Compose

Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration.

A sample docker-compose file might include multiple services such as adapter transport service, adapter dml ingenium and others.
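As a sketch, such a compose file could look like the following; the service names, build contexts, and port mappings are assumptions:

```yaml
# Illustrative docker-compose.yml; adjust paths and ports to your repository.
services:
  adapter-transport:
    build: ./adapter-transport
    ports:
      - "8080:8080"
  adapter-dml-ingenium:
    build: ./adapter-dml-ingenium
    ports:
      - "8081:8080"
```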

  • docker-compose build - Build or rebuild services.

  • docker-compose up - Create and start containers.

  • docker-compose stop - Stop services.

  • docker-compose start - Start services.

  • docker-compose down - Stop and remove containers, networks.

  • docker-compose logs - View output from containers.

Please reference the Docker Compose documentation.
