...

Structured File Lens + Kafka + Lens Writer

Once you have your Structured File Lens and your Lens Writer up and running, the example below describes an end-to-end, enterprise-ready, highly scalable system for ingesting your structured files (CSV/XML/JSON) into your Knowledge or Property Graph. The intended flow of your data through the systems is as follows:

...

When provenance data is generated in the Structured File Lens, you have the option to specify a separate output directory location for the generated provenance RDF, and a separate Kafka Success Queue, via the PROV_OUTPUT_DIR_URL and PROV_KAFKA_TOPIC_NAME_SUCCESS config options respectively. If you wish to have your provenance data uploaded to a separate Triple Store, an additional Lens Writer is required; this writer is configured so that its Kafka topic is directed to the separate provenance topic specified in the Lens.
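To make the role of that additional writer concrete, here is a minimal Python sketch (using the kafka-python and requests libraries) of a process that consumes the provenance success topic and loads RDF into a separate Triple Store. The broker address, topic name, store URL, and the assumption that each success message carries the location of the generated provenance RDF are illustrative only; in practice the Lens Writer does this for you, and only its Kafka topic setting needs to point at the provenance topic.

    from kafka import KafkaConsumer          # pip install kafka-python
    import requests

    # Illustrative values only -- substitute your own broker, topic, and store.
    KAFKA_BROKER = "localhost:9092"
    PROV_TOPIC = "provenance-success"        # value of PROV_KAFKA_TOPIC_NAME_SUCCESS
    PROV_STORE_URL = "http://localhost:3030/provenance/data"  # separate Triple Store

    consumer = KafkaConsumer(
        PROV_TOPIC,
        bootstrap_servers=KAFKA_BROKER,
        value_deserializer=lambda v: v.decode("utf-8"),
    )

    for message in consumer:
        # Assumption: each success message carries the location of the generated RDF.
        rdf_url = message.value
        rdf_payload = requests.get(rdf_url).content

        # Load the provenance RDF into the separate Triple Store
        # (SPARQL 1.1 Graph Store HTTP Protocol).
        requests.post(
            PROV_STORE_URL,
            data=rdf_payload,
            headers={"Content-Type": "text/turtle"},
        )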

...

SQL Lens + Kafka + Lens Writer

Once you have your SQL Lens and your Lens Writer up and running, the example below describes an end-to-end, enterprise-ready, highly scalable system for ingesting data from your Relational SQL Databases into your Knowledge or Property Graph. The intended flow of your data through the systems is as follows:

...

As seen in the SQL Lens User Guide, the connection to your Database lies within the mapping files that you have created. The process for the Lens to start ingesting data from your DB can be triggered in two ways. The first is to use the exposed API Endpoint: a simple GET request targeting the Lens, for example http://<lens-ip>:<lens-port>/process. The second is to use a Cron Expression to set up a time-based job scheduler, which will have the Lens ingest your specified data from your database(s) periodically at fixed times, dates, or intervals.
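As a minimal sketch, the Python snippet below triggers a single ingestion run by calling the /process endpoint described above; the host and port are placeholders for your own deployment. For the scheduled alternative, a standard five-field Cron Expression such as 0 2 * * * would run the job daily at 02:00, although the exact cron syntax expected by your Lens version may differ.

    import requests

    # Placeholders -- replace with the host and port of your running SQL Lens.
    LENS_HOST = "localhost"
    LENS_PORT = 8080

    # Trigger a single ingestion run via the exposed /process endpoint.
    response = requests.get(f"http://{LENS_HOST}:{LENS_PORT}/process")
    response.raise_for_status()
    print("Ingestion triggered:", response.status_code)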

...

When provenance data is generated in the SQL Lens, you have the option to specify a separate output directory location for the generated provenance RDF, and a separate Kafka Success Queue, via the PROV_OUTPUT_DIR_URL and PROV_KAFKA_TOPIC_NAME_SUCCESS config options respectively. If you wish to have your provenance data uploaded to a separate Triple Store, an additional Lens Writer is required; this writer is configured so that its Kafka topic is directed to the separate provenance topic specified in the Lens.

...

RESTful Lens + Kafka + Lens Writer

Once you have your RESTful Lens and your Lens Writer up and running, the example below describes an end-to-end, enterprise-ready, highly scalable system for ingesting data from your REST API endpoint into your Knowledge or Property Graph. The intended flow of your data through the systems is as follows:

...

When provenance data is generated in the RESTful Lens, you have the option to specify a separate output directory location for the generated provenance RDF, and a separate Kafka Success Queue, via the PROV_OUTPUT_DIR_URL and PROV_KAFKA_TOPIC_NAME_SUCCESS config options respectively. If you wish to have your provenance data uploaded to a separate Triple Store, an additional Lens Writer is required; this writer is configured so that its Kafka topic is directed to the separate provenance topic specified in the Lens.

...

Document Lens + Kafka + Lens Writer

Once you have your Document Lens and your Lens Writer up and running, the example below describes an end-to-end, enterprise-ready, highly scalable system for ingesting your document files (PDF/doc(x)/txt) into your Knowledge or Property Graph. The intended flow of your data through the systems is as follows:

...

When provenance data is generated in the Document Lens, you have the option to specify a separate output directory location for the generated provenance RDF, and a separate Kafka Success Queue, via the PROV_OUTPUT_DIR_URL and PROV_KAFKA_TOPIC_NAME_SUCCESS config options respectively. If you wish to have your provenance data uploaded to a separate Triple Store, an additional Lens Writer is required; this writer is configured so that its Kafka topic is directed to the separate provenance topic specified in the Lens.

...