Quick Start Guide - Structured File Lens v2.0
This quick start guide gets the Structured File Lens up and running as quickly and simply as possible so you can start ingesting and transforming data straight away. For more in-depth instructions, see the User Guide.
In this guide we will set up and run the Lens either as a Docker image deployed to your local machine or as the AWS Marketplace offering run on ECS; however, we support a number of cloud deployment technologies, including full support for AWS. Once deployed, you can utilise any of our ready-made sample mapping files and expected output files to test your Lens.
1. Creating a Mapping File
The first step in configuring the Structured File Lens is to create a mapping file. The mapping file defines the links between your source data and your target model (ontology). We have written a detailed step-by-step guide on creating one, along with a number of examples. The Structured File Lens is capable of ingesting XML, CSV, and JSON files, and the creation of mapping files differs slightly between file types, so be sure to select the correct options for your use case.
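For a flavour of what a mapping file can look like, below is a minimal sketch of a CSV mapping written in RML (Turtle syntax), which the logicalSource terminology used by the Lens suggests. The ex: namespace, the Person class and name property, and the id and name columns are illustrative assumptions rather than product defaults; follow the mapping guide for the authoritative syntax.
@prefix rr:  <http://www.w3.org/ns/r2rml#> .
@prefix rml: <http://semweb.mmlab.be/ns/rml#> .
@prefix ql:  <http://semweb.mmlab.be/ns/ql#> .
@prefix ex:  <http://example.com/ns#> .

# Maps each row of sample.csv to an ex:Person with a name
<#PersonMapping> a rr:TriplesMap ;
    rml:logicalSource [
        rml:source "sample.csv" ;            # matches the logicalSources request parameter
        rml:referenceFormulation ql:CSV
    ] ;
    rr:subjectMap [
        rr:template "http://example.com/person/{id}" ;
        rr:class ex:Person
    ] ;
    rr:predicateObjectMap [
        rr:predicate ex:name ;
        rr:objectMap [ rml:reference "name" ]
    ] .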
2. Configuring the Lens
All Lenses supplied by Data Lens are configurable through the use of environment variables. There are a number of options to tailor the Lens to your needs; however, to get started, the following config options are recommended:
License - LICENSE
This is the license key required to operate the Lens when it is run on a local machine outside of AWS; request your new unique license key here.
Lens Directory - LENS_DIRECTORY
This is the directory where all Lens files are stored (assuming the individual file directory configs haven't been edited). On Lens startup, if this has been declared, the Lens will create folders at the specified location for mapping, output, yaml-mapping, provenance output, and config backup, as sketched below. By default, this option is set to a local directory within the Docker container (file:///var/local/) so it isn't mandatory. As with all directories in the Lens, this can be either local or on a remote S3 bucket; we recommend using S3 when running the Lens on AWS (for example, s3://example-bucket/sflens/).
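As an illustrative sketch (these folder names are assumptions based on the description above, not confirmed defaults), setting LENS_DIRECTORY=s3://example-bucket/sflens/ would produce a layout along these lines:
s3://example-bucket/sflens/
    mappings/          - mapping files (e.g. mapping.ttl)
    yaml-mappings/     - YAML mapping files
    output/            - generated RDF output files
    provenance/        - provenance output
    config-backup/     - backups of the Lens configuration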
Record Provenance - RECORD_PROVO
In the Lenses, time-series data is supported as standard, so every time a Lens ingests some data, we add provenance information. This means that you have a full record of your data over time, allowing you to see what the state of the data was at any moment.
When setting up and testing a Lens for the first time, it may be practical to turn off provenance until your environment is ready for production. This is done by setting RECORD_PROVO to false; it can then be turned on at a later date by calling the updateConfig endpoint, as sketched below. Provenance files will then be saved to the location set by the PROV_OUTPUT_DIR_URL option, or to the provenance directory created by the Lens directory config option.
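As a minimal sketch of that call (the exact name-value format accepted by updateConfig is an assumption here; consult the User Guide for the authoritative API), re-enabling provenance from the command line might look like:
curl "http://<lens-ip>:<lens-port>/updateConfig?RECORD_PROVO=true"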
3. Running the Lens
All of our Lenses are designed and built to be versatile, allowing them to be set up and run in a number of environments, whether in the cloud or on-premise. This is achieved through the use of Docker containers. To run the Lens on AWS, simply use the CloudFormation template we have created to start up an ECS Cluster with all the required permissions and networking, with the Lens running within it as a task. Alternatively, to run the Lens' Docker image locally, first ensure you have Docker installed. Once installed, execute a docker run command with the following structure, and Docker will start the container and run the Lens from your downloaded image. NB: the tag for the image you are using may differ from the example here.
For UNIX based machines (macOS and Linux):
docker run \
-e LICENSE=<<<REQUEST LICENSE>>> \
-e LENS_DIRECTORY=file:///data/sflens/ \
-e RECORD_PROVO=false \
-p 8080:8080 \
-v /User/DataLens/sflens/:/data/sflens/ \
lens-static:Release_2.0.4.250
For Windows:
docker run ^
-e LICENSE=<<<REQUEST LICENSE>>> ^
-e LENS_DIRECTORY=file:///data/sflens/ ^
-e RECORD_PROVO=false ^
-p 8080:8080 ^
-v //c//User/DataLens/sflens/:/data/sflens/ ^
lens-static:Release_2.0.4.250
The above examples demonstrate how to override configuration options using environment variables in your Lens. Since the Lens runs on port 8080, line 5 exposes and binds that port on the host machine so that the APIs can be triggered. The -v flag on line 6 mounts the working directory into the container; when the host directory of a bind-mounted volume doesn't exist, Docker will automatically create it on the host for you. Finally, line 7 is the name and version of the Docker image you wish to run. For more information on running Docker images, see the official docs.
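Once the container starts, a quick sanity check using standard Docker commands is to confirm the container is up and follow the Lens logs (the container ID is whatever docker ps reports for the lens-static image):
docker ps
docker logs -f <container-id>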
4. Ingesting Data / Triggering the Lens
The easiest way to ingest a file into the Structured File Lens is to use the built-in process endpoint. Using this endpoint, you can specify the URL of a file to ingest, along with the logicalSource correlating to your mapping, and in return you will be provided with the URL(s) of the generated RDF data file(s).
For example, using a GET request:
<lens-ip>:<lens-port>/process?inputFileURLs=<input-file-url>&logicalSources=<logical-source>
→ http://127.0.0.1:8080/process?inputFileURLs=file:///var/local/input-data.csv&logicalSources=sample.csv
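The same request can be issued from the command line with curl, assuming the Lens is running locally on the default port as above (quoting the URL stops the shell from interpreting the & between query parameters):
curl "http://127.0.0.1:8080/process?inputFileURLs=file:///var/local/input-data.csv&logicalSources=sample.csv"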
Once an input file has successfully been processed, the Lens returns a JSON response containing the outputFileLocations element. This element contains a list of the URLs of all generated RDF files. (Multiple files are only generated when ingesting large CSV files.)
Sample output:
{
"input": [
"file:///var/local/input/input-data.csv"
],
"failedIterations": 0,
"successfulIterations": 1,
"outputFileLocations": [
"file:///var/local/output/Structured-File-Lens-44682bd6-3fbc-429b-988d-40dda8892328.nq"
],
"mappingFiles": [
"file:///var/local/mappings/mapping.ttl"
]
}
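When scripting against the Lens, the generated file URLs can be extracted directly from that JSON response; below is a minimal sketch using curl and jq (jq is assumed to be installed, and the request is the same sample request shown above):
curl -s "http://127.0.0.1:8080/process?inputFileURLs=file:///var/local/input-data.csv&logicalSources=sample.csv" | jq -r '.outputFileLocations[]'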
Kafka
You can also trigger the Lens by sending a message to a Kafka queue. To utilise this, you must ensure LENS_RUN_STANDALONE is set to false and the Kafka configuration is configured correctly for your brokers. Once set up, you can simply push a message containing the URL of the file to the topic specified by KAFKA_TOPIC_NAME_SOURCE.
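As a minimal sketch of such a trigger (the topic name, broker address, and plain-URL message format are assumptions; check your KAFKA_TOPIC_NAME_SOURCE setting and the User Guide for the exact message schema the Lens expects), you could push a test message with the console producer shipped with Kafka:
echo "file:///var/local/input-data.csv" | kafka-console-producer.sh --bootstrap-server localhost:9092 --topic source-urls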