Modeler toolbox
The modeler toolbox contains these categories of items:
- Core activities - to send and receive documents.
- Data Lake activities - to ingest documents to or retrieve documents from Data Lake.
- Supporting activities - to set parameters.
- Flows - to route or filter documents.
Overview
For consistency, and to distinguish connection point activities from other activities in ION, the activity colors in the toolbox were updated. The colors relate to the activity type in the desk data flow and workflow modelers.
This list shows which colors are used for which activities:
- ORANGE for core activities in:
  Workflows: User interaction, for example, task or notification.
  Data flows: Connections, for example, connection points.
- RED for Data Lake activities in:
  Data Lake flows: Ingest, Retrieve, Query.
- GREEN for supporting activities in:
  Document flows: Mapping, scripting, splitting, workflow.
  Data Lake flows: Mapping, scripting.
  Workflows: System activities such as set parameter.
- BLUE for flow activities in:
  Document flows: Filter, parallel, routing, merge.
  Data Lake flows: Parallel, merge.
  Workflows: Filter, parallel, loop back, sub-process.
- The ION API activity is green in the workflow modeler, similar to set parameter, exit point, and other system activities. In the workflow modeler, the core activities are user interactions such as tasks and notifications.
- The ION API activity is orange in the data flow modeler, similar to other connection points. In the data flow modeler, the core activities are application interactions such as connection points.
Data Flow Modeler - Activities
This table shows the toolbox icons for the Activities category:
Name | Widget | Description |
---|---|---|
Application | | Activity from an application, which sends or receives messages. For this activity, you must create a new application connection point or select an existing application connection point. If multiple application instances have the same role in the business process, you can select multiple application connection points. For example, when you have multiple instances of a warehouse management system, one per warehouse, that must receive the same master data documents. |
Network | | Activity to send messages to or receive messages from a different tenant. To use this activity, you must be provisioned with a 'Network' connection point. Note: Not available in Data Lake flows or when no Network connection point is provisioned. |
Database | | Activity to read or write a database. For this activity, you must create a new database connection point or select an existing database connection point. See Infor ION Technology Connectors Administration Guide. |
File | | Activity to read or write text files. For this activity, you must create a new File connection point or select an existing File connection point. See Infor ION Technology Connectors Administration Guide. |
ION API | | Activity to Send to API, Get from API, or Trigger API call. For this activity, you must create a new ION API connection point or select an existing ION API connection point. See Infor ION Technology Connectors Administration Guide. |
Message Queue | | Activity to read or write a message queue. For this activity, you must create a new message queue connection point or select an existing message queue connection point. See Infor ION Technology Connectors Administration Guide. |
Web Service | | Activity to use documents from a web service or send documents to a web service. For this activity, you must create a new web service connection point or select an existing web service connection point. Note: Not available in Data Lake flows. See Infor ION Technology Connectors Administration Guide. |
Stream | | Activity to send documents to Amazon Kinesis Data Streams. For this activity, you must create a new Stream connection point or select an existing Stream connection point. See Infor ION Technology Connectors Administration Guide. This activity is available only to customers with an Enterprise license. |
Retrieve | | Activity to schedule incremental retrieval of documents from Data Lake. Note: Available only in Data Lake flows. |
Query | | Activity to schedule a specific query that retrieves a custom subset of your data from Data Lake. Note: Available only in Data Lake flows. |
Ingest | | Activity to ingest all incoming documents to Data Lake. Note: Available only in Data Lake flows. |
Mapping | | Activity to map a document to another type of document, or to change the data in the document. For this activity, you must create a new mapping or select an existing mapping. See Document mappings. |
Scripting | | Activity to run scripts with custom Python code on incoming documents. For this activity, you must create a new script or select an existing script. See "ION Scripting". A sketch of such a script is shown after this table. |
Workflow | | Activity to start a workflow. In this activity you select a workflow and link the input and output parameters of that workflow to attributes of the incoming document. When a document arrives at this activity, a workflow is started. When the workflow is completed, the output of the workflow is added to the document and the document flow continues. Note: Not available in Data Lake flows. |
Splitter | | Activity to split one document containing multiple instances of the same object into separate documents. Each document contains one object together with a header and footer. See the splitting sketch after this table. Note: Not available in Data Lake flows. |
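The following is a minimal sketch of the kind of Python code a Scripting activity can run on an incoming document. The document structure and the element name `Location` are assumptions for illustration only; in ION Scripting, the actual input and output variables are declared in the script definition and bound to document content by the data flow.

```python
# Minimal sketch of a transformation a Scripting activity could apply.
# The document structure and field names are illustrative assumptions;
# the real input/output binding is configured in the ION Scripting editor.
import xml.etree.ElementTree as ET


def normalize_location(document_xml: str) -> str:
    """Upper-case the Location element of an (assumed) item document."""
    root = ET.fromstring(document_xml)
    for location in root.iter("Location"):
        if location.text:
            location.text = location.text.strip().upper()
    return ET.tostring(root, encoding="unicode")


if __name__ == "__main__":
    sample = "<ItemMaster><ItemID>A-100</ItemID><Location> wh01 </Location></ItemMaster>"
    print(normalize_location(sample))
    # <ItemMaster><ItemID>A-100</ItemID><Location>WH01</Location></ItemMaster>
```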
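To illustrate what the Splitter activity does, the sketch below splits a single document that contains several instances of the same object into one document per instance, each keeping the surrounding header. The element names (`Sync`, `Header`, `Item`) are illustrative assumptions, not the actual document structure.

```python
# Illustrative sketch of splitter behavior: one document with several
# "Item" instances becomes one document per instance. Element names are
# assumptions for illustration only.
import copy
import xml.etree.ElementTree as ET


def split_document(document_xml: str, instance_tag: str = "Item") -> list[str]:
    root = ET.fromstring(document_xml)
    documents = []
    for instance in root.findall(instance_tag):
        single = copy.deepcopy(root)
        # Keep header/footer elements, but only one object instance per output.
        for other in single.findall(instance_tag):
            single.remove(other)
        single.append(copy.deepcopy(instance))
        documents.append(ET.tostring(single, encoding="unicode"))
    return documents


if __name__ == "__main__":
    doc = ("<Sync><Header>batch-1</Header>"
           "<Item>A</Item><Item>B</Item><Item>C</Item></Sync>")
    for part in split_document(doc):
        print(part)
```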
Data Flow Modeler - Flows
This table shows the toolbox icons for the Flows category:
Name | Widget | Description |
---|---|---|
Parallel | | Flow to use multiple activities in parallel. Do not define properties for a parallel flow. You can add or remove branches by right-clicking the diamond shape. |
Filter | | Flow to limit the number of messages that are delivered to a connection point. To define a filter, select one or more properties from the involved document and build a condition using these properties. See Filtering and content-based routing, and the condition sketch after this table. A filter can also be used if the previous activity provides multiple document types; in that case a filter is defined per document type. Not all document types require filtering. Note: Not available in Data Lake flows. |
Routing | | Flow to send a document to a subset of the available destinations based on the document contents. To define the routing, select one or more properties from the involved document and build conditions using these properties. A condition must be defined for each branch. See Filtering and content-based routing. You can add or remove branches by right-clicking the diamond shape. A routing flow can be used only if the previous activity provides a single document type. Note: Not available in Data Lake flows. |
Merge | | Flow to split the input document into two parallel branches, trigger the activities in the top branch, and merge the resulting "patch" document with the input document. There are two merge types. |
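As a rough illustration of the kind of condition a Filter or a Routing branch evaluates, the sketch below applies a content-based condition to a document. The document type name, the attribute `accountingEntity`, and the matching value are illustrative assumptions; in ION, the condition itself is built in the filter or routing editor, not in code.

```python
# Rough illustration of a content-based condition, as used by Filter and
# Routing: only documents whose (assumed) accountingEntity attribute matches
# the branch condition are passed on. In ION the condition is defined in the
# modeler rather than written as code.
from dataclasses import dataclass


@dataclass
class Document:
    doc_type: str
    attributes: dict


def branch_accepts(document: Document) -> bool:
    # Example condition: pass only Sync.ItemMaster documents for entity "100".
    return (document.doc_type == "Sync.ItemMaster"
            and document.attributes.get("accountingEntity") == "100")


if __name__ == "__main__":
    docs = [
        Document("Sync.ItemMaster", {"accountingEntity": "100"}),
        Document("Sync.ItemMaster", {"accountingEntity": "200"}),
        Document("Process.PurchaseOrder", {"accountingEntity": "100"}),
    ]
    print([branch_accepts(d) for d in docs])  # [True, False, False]
```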