Data flows

The most important items in ION Connect are data flows.

A data flow is a sequence of activities that send or receive documents. Data flows are event-driven. When a document is published by an application, the next step in the flow is triggered. Each flow starts and ends with one or more connection points. Connection points can be reused in multiple data flows, and the same connection point can be used multiple times in a data flow.
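Flows are modeled graphically in ION Connect rather than coded. Still, the event-driven principle can be pictured with a small, purely conceptual Python sketch: publishing a document triggers each next step of the flow until the document reaches a receiving connection point. All names in the sketch are illustrative assumptions, not part of the product.

```python
# Conceptual model only: ION flows are configured graphically, not coded.
# Publishing a document triggers each step of the flow in turn.
from typing import Callable

Step = Callable[[str], str]

def run_flow(steps: list[Step], document: str) -> str:
    """Push a published document through the steps of a flow."""
    for step in steps:
        document = step(document)
    return document

# A trivial flow: a mapping step followed by delivery to a connection point.
flow: list[Step] = [
    lambda doc: doc.replace("custom", "standard"),  # mapping step
    lambda doc: doc,                                # receiving connection point
]
print(run_flow(flow, "custom sales order"))  # prints: standard sales order
```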

Document flows

Usually, a document flow represents a business process. For example:

  • An invoicing process where an invoice is created for a delivery that was shipped.
  • A maintenance process where a service order is planned based on a customer repair request.

In a simple document flow, two connection points are involved: one connection point sends a specific type of document, and the other connection point receives those documents. Document flows can be more complex.

In addition to connection points, document flows can contain other items:

  • Mappings are used to translate a document to another format. For example, if you retrieve custom sales order documents from your database, you must translate them to standard Infor BODs to load them into an Infor application.
  • Scripts are used for executing custom Python code on incoming documents. For example, scripts can solve common use cases such as data object format conversion, data mapping, and complex calculations and transformations; see the first sketch below.
  • Parallel flows are used if a document must be sent to multiple connection points, or if documents from multiple connection points must be sent to a single connection point.
  • Filters are used to limit the number of messages that are delivered to a connection point. For example, if only documents with a specific status are relevant for an application, you can filter out the documents with any other status; see the second sketch below.
  • Content-based routings are used to send a document to a subset of the available destinations based on the document contents. For example, if three warehouses exist, each with its own warehouse management system, a SalesOrder message can be routed to one of these warehouses based on the warehouse code that is specified in the message; see the third sketch below.

The execution of document flows is fully automatic. After a flow is defined and activated, the relevant documents are routed accordingly.
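As an illustration of what a script (and, in effect, a mapping) can do, here is a minimal sketch that converts a date field to another format. It assumes the incoming document is available as an XML string and that the script returns the converted string; the function name and the SalesOrder structure are assumptions, not the actual ION scripting contract.

```python
# Minimal sketch of a script step. The function name, the SalesOrder
# structure, and the OrderDate field are illustrative assumptions; the
# real input/output contract is defined when the script is configured.
import xml.etree.ElementTree as ET

def convert_document(input_document: str) -> str:
    """Convert OrderDate from DD-MM-YYYY to the ISO format YYYY-MM-DD."""
    root = ET.fromstring(input_document)
    for date_node in root.iter("OrderDate"):
        day, month, year = date_node.text.split("-")
        date_node.text = f"{year}-{month}-{day}"
    return ET.tostring(root, encoding="unicode")

converted = convert_document(
    "<SalesOrder><OrderDate>31-01-2025</OrderDate></SalesOrder>"
)
print(converted)  # <SalesOrder><OrderDate>2025-01-31</OrderDate></SalesOrder>
```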
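A filter is configured in ION rather than coded; this second sketch only shows the effect of a filter condition that delivers documents with a specific status. The Status element and the value "Open" are assumptions.

```python
# Sketch of a filter condition: only documents whose Status is "Open"
# are delivered. The Status element and its values are assumptions.
import xml.etree.ElementTree as ET

def passes_filter(document_xml: str) -> bool:
    """Return True if the document should be delivered."""
    root = ET.fromstring(document_xml)
    return root.findtext(".//Status") == "Open"

documents = [
    "<ServiceOrder><Status>Open</Status></ServiceOrder>",
    "<ServiceOrder><Status>Closed</Status></ServiceOrder>",
]
delivered = [doc for doc in documents if passes_filter(doc)]
print(len(delivered))  # prints: 1
```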
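Likewise, a content-based routing is configured rather than coded. This third sketch only illustrates the routing decision for the warehouse example above; the WarehouseCode element and the destination names are assumptions.

```python
# Sketch of a content-based routing decision: the warehouse code inside
# the SalesOrder selects the destination. All names are assumptions.
import xml.etree.ElementTree as ET

DESTINATIONS = {
    "WH1": "warehouse-1-wms",
    "WH2": "warehouse-2-wms",
    "WH3": "warehouse-3-wms",
}

def route(sales_order_xml: str) -> str:
    """Return the destination for a SalesOrder document."""
    root = ET.fromstring(sales_order_xml)
    warehouse_code = root.findtext(".//WarehouseCode")
    return DESTINATIONS[warehouse_code]

print(route("<SalesOrder><WarehouseCode>WH2</WarehouseCode></SalesOrder>"))
# prints: warehouse-2-wms
```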

Data Lake flows

A Data Lake flow is a sequence of activities that results in sending data into Data Lake, or a sequence of activities that starts with the retrieval of data from Data Lake.

In a "simple" Data Lake flow, two connection points are involved. One Data Lake connection point sends a specific type of documents, and the other connection point receives those documents. Or a connection point sends a specific type of documents, and the Data Lake connection point receives those documents. Data Lake flows can be more complex.

Some activities are not allowed to be modeled in a Data Lake flow.

See Creating and using data flows.

Asynchronous activities are not allowed to be modeled in the middle of a Data Lake flow. Specifically, this applies to these connection points:

  • Application (IMS)
  • Application (in-box/outbox)
  • LN
  • CRM Business Extension
  • File
  • API (Send, Read)
  • Database (Read, Send, Request/Reply)
  • Message Queue (JMS)

In addition to connection points, a Data Lake flow can contain other items such as:

  • Mappings that are used to translate a document to another format.
  • Scripts that are used for executing custom Python code that takes a document as input; see the sketch below.
  • Parallel flows that are used when sending documents from multiple connection points to Data Lake.
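As a minimal sketch of a script in a Data Lake flow, assume the document arrives as a JSON string and the script returns a normalized JSON string before the flow sends it on to Data Lake; the field names are illustrative assumptions.

```python
# Sketch of a script step in a Data Lake flow: normalize a field before
# the document is sent to Data Lake. Field names are assumptions.
import json

def normalize(document_json: str) -> str:
    """Uppercase the currency code so records land in Data Lake consistently."""
    record = json.loads(document_json)
    record["currency"] = record["currency"].upper()
    return json.dumps(record)

print(normalize('{"orderId": 42, "currency": "usd"}'))
# prints: {"orderId": 42, "currency": "USD"}
```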