...

  • License - LICENSE

    • This is the license key required to operate the Lens when it is run on a local machine outside of AWS. Request your new unique license key here.

  • Lens Directory - LENS_DIRECTORY

    • This is the directory where your mapping file(s) are located.

    • This is the directory where all Lens files are stored (assuming the individual file directory configs haven’t been edited). On Lens startup, if this has been declared, the Lens will create folders at the specified location for mapping, output, yaml-mapping, provenance output, and config backup.

    • By default, this option is set to a local directory within the Docker container (file:///var/local/), so it isn’t mandatory. As with all directories in the Lens, this can be either local or on a remote S3 bucket - we recommend using S3 when running the Lens on AWS (for example - s3://example-bucket/sqllens/).

  • Record Provenance - RECORD_PROVO

    • In the Lenses, time-series data is supported as standard, so every time a Lens ingests some data, we add provenance information. This means you have a full record of your data over time, allowing you to see what the state of the data was at any moment.

    • When setting up and testing a Lens for the first time, it may be practical to turn off provenance until your environment is ready for production. This is done by setting RECORD_PROVO to false; it can then be turned back on at a later date by calling the updateConfig endpoint (a sketch of such a call follows this list). Provenance files are then saved to the location set by the PROV_OUTPUT_DIR_URL option, or to the provenance directory created by the Lens directory config option.
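As a rough sketch only, re-enabling provenance on a locally running Lens might look like the following. The exact path and parameter format of the updateConfig endpoint are documented in the full User Guide; the URL and query parameter shown here are assumptions.

Code Block
# Sketch only: the updateConfig path and parameter format may differ -
# see the full User Guide. Assumes the Lens is exposed on port 8080.
curl -X POST "http://localhost:8080/updateConfig?RECORD_PROVO=true"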

...

All of our Lenses are designed and built to be versatile, allowing them to be set up and run in a number of environments, including in the cloud or on-premise. This is achieved through the use of Docker Containers. To run the Lens on AWS, simply use the CloudFormation template we have created to start up an ECS Cluster with all the required permissions and networking, with the Lens running within it as a task. Alternatively, to run the Lens' Docker image locally, please first ensure you have Docker installed. Once installed, execute a docker run command with the following structure, and Docker will start the container and run the Lens from your downloaded image.

...

The above examples demonstrate how to override configuration options using environment variables in your Lens. Given that the Lens runs on port 8080, line 5 exposes and binds that port to the host machine so that the APIs can be triggered. The -v flag seen on line 6 mounts the working directory into the container; when the host directory of a bind-mounted volume doesn’t exist, Docker will automatically create it on the host for you. And finally, line 7 is the name and version of the Docker image you wish to run. For more information on running Docker images, see the official Docs.
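For reference, a minimal command with that structure might look like the following. The image name, tag, and host path are placeholders and should be replaced with your own values; the environment variables shown are the configuration options described above.

Code Block
# Sketch only: image name, tag, and host path are placeholders.
docker run \
    -e LICENSE="<your-license-key>" \
    -e LENS_DIRECTORY="file:///data/sqllens/" \
    -e RECORD_PROVO="false" \
    -p 8080:8080 \
    -v /home/user/sqllens:/data/sqllens \
    <lens-image>:<version>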

...

Once the ingestion and transformation have been successfully processed, the Lens returns a JSON response. This response contains a list of the URLs of all generated RDF files, as well as some other useful information. Multiple files are only generated when specifying an SQL Limit and Offset in your mapping query.

Sample output:

Code Block
{
    "successfulIterations": 103,
    "outputFileLocations": {
        "outputFileLocation1": "file:///data/sqllens/output/SQL-Lens-31235bfd-a6fb-43aa-bb51-e3d41c481983.nq",
        "outputFileLocation2": "file:///data/sqllens/output/SQL-Lens-5b27a2a4-c971-4fd4-b128-d131bf5c3981.nq",
        "outputFileLocation3": "file:///data/sqllens/output/SQL-Lens-5ecd6e81-65ff-4dbc-aed8-d6d320b7d04a.nq"
    },
    "processingTime": 20
}

Cron job

Ingestion of your DB data can be triggered in multiple ways: in addition to the RESTful service, there is also a built-in Quartz Time Scheduler. This uses a user-configurable Cron Expression to set up a time-based job scheduler, which schedules the Lens to ingest your specified data from your database(s) periodically at fixed times, dates, or intervals. More information on this can be found in the full User Guide.
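For illustration, a standard Quartz Cron Expression is made up of six (optionally seven) fields; the name of the configuration option that accepts it is covered in the full User Guide.

Code Block
# Quartz cron fields: seconds  minutes  hours  day-of-month  month  day-of-week  [year]
# The example below would schedule an ingestion every day at 02:00.
0 0 2 * * ?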

Kafka

You can also trigger the Lens by sending a message to a Kafka queue. To utilise this, you must ensure LENS_RUN_STANDALONE is set to false and the Kafka configuration is set correctly for your brokers. Once set up, you can simply push a message to the topic specified by the KAFKA_TOPIC_NAME_SOURCE option.
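As a rough sketch, a message can be pushed to that topic from the command line with Kafka's console producer. The broker address, topic name, and message payload below are placeholders; the payload format the Lens expects is described in the full User Guide.

Code Block
# Placeholders only: substitute your broker address and the topic name
# configured via KAFKA_TOPIC_NAME_SOURCE. See the full User Guide for
# the expected message payload format.
echo '<message payload>' | kafka-console-producer.sh \
    --bootstrap-server localhost:9092 \
    --topic <kafka-source-topic>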