Modeler toolbox

The modeler toolbox contains these categories of items:

  • Core activities - to send and receive documents.
  • Data Lake activities - to ingest documents to or retrieve documents from Data Lake.
  • Supporting activities - to set parameters.
  • Flows - to route or filter documents.
Note: Only a subset of the activities and flows is available in the Data Lake flow modeler toolbox.

Overview

For consistency, and to distinguish connection point activities from other activities in ION, the colors of the activities in the toolbox were updated. The colors relate to the activity type in the ION Desk data flow and workflow modelers.

This list shows which colors are used for which activities:

  • ORANGE for core activities in:

    Workflows: User interaction, for example, task, notification, and so on.

    Data flows: Connections, for example, connection points.

  • RED for Data Lake activities in:

    Data Lake flows: Ingest, Retrieve, Query

  • GREEN for supporting activities in:

    Document flow: Mapping, scripting, splitting, workflow.

    Data Lake flows: Mapping, scripting

    Workflow: System activities such as set parameter.

  • BLUE for flow activities in:

    Document flow: Filter, parallel, routing, merge.

    Data Lake flows: Parallel, merge

    Data flows: Connections.

    Workflow: Filter, parallel, loop back, sub-process.

  • The ION API activity is green in the workflow modeler, similar to set parameter, exit point, and so on. In the workflow modeler, the core activities are user interactions such as tasks and notifications.
  • The ION API activity is orange in the data flow modeler, similar to other connection points. In the data flow modeler, the core activities are application interactions such as connection points.

Data Flow Modeler - Activities

This table shows the toolbox icons for the Activities category:

Name Widget Description
Application nt_ico_application_orange

Activity from an application, which sends or receives messages. For this activity, you must create a new application connection point or select an existing application connection point.

If multiple application instances have the same role in the business process, you can select multiple application connection points. For example, you may have multiple instances of a warehouse management system, one per warehouse, that must all receive the same master data documents.

Network nt_ico_network

Activity to send messages to or receive messages from a different tenant.

To use this activity, you must be provisioned with a 'Network' connection point.

Note: Not available in Data Lake flows, or when no Network connection point is provisioned.
Database nt_ico_db_orange

Activity to read from or write to a database.

For this activity, you must create a new database connection point or select an existing database connection point.

See Infor ION Technology Connectors Administration Guide.

File nt_ico_file_orange

Activity to read or write text files.

For this activity, you must create a new File connection point or select an existing File connection point.

See Infor ION Technology Connectors Administration Guide.

ION API nt_ico_ion_api

Activity to Send to API, Get from API, or Trigger API call.

For this activity, you must create a new ION API connection point or select an existing ION API connection point.

See Infor ION Technology Connectors Administration Guide.

Message Queue nt_ico_message_queue

Activity to read from or write to a message queue.

For this activity, you must create a new message queue connection point or select an existing message queue connection point.

See Infor ION Technology Connectors Administration Guide.

Web Service nt_ico_web_service_orange

Activity to use documents from a web service or send documents to a web service.

For this activity, you must create a new web service connection point or select an existing web service connection point.

Note: Not available in Data Lake flows.

See Infor ION Technology Connectors Administration Guide.

Stream nt_ico_stream

Activity to send documents to Amazon Kinesis Data Streams.

For this activity, you must create a new Stream connection point or select an existing Stream connection point.

See Infor ION Technology Connectors Administration Guide.

This activity is available only for customers with an Enterprise license.

Retrieve nt_icon_retrieve

Activity to schedule incremental retrieval of documents from Data Lake.

Note:  Available only in Data Lake flows.
Query nt_ico_query

Activity to schedule a specific query that retrieves a custom subset of your data from Data Lake.

Note:  Available only in Data Lake flows.
Ingest nt_ico_ingest

Activity to ingest all incoming documents to Data Lake.

Note:  Available only in Data Lake flows.
Mapping nt_ico_mapping_activity

Activity to map a document to another type of document, or to change the data in the document.

For this activity, you must create a new mapping or select an existing mapping. See Document mappings.
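Conceptually, a mapping transforms one document structure into another. This hedged sketch in plain Python illustrates the idea only; the document types and field names are invented for illustration and are not taken from ION:

```python
def map_sales_order_to_shipment(order: dict) -> dict:
    """Illustrative mapping: derive a hypothetical Shipment document
    from a hypothetical SalesOrder document."""
    return {
        "documentType": "Shipment",        # target document type
        "orderId": order["id"],            # copied field
        "destination": order["shipTo"],    # renamed field
        "lineCount": len(order["lines"]),  # derived field
    }

order = {"id": "SO-1", "shipTo": "Warehouse A", "lines": [{"item": "X"}, {"item": "Y"}]}
shipment = map_sales_order_to_shipment(order)
```

In ION itself, mappings are defined graphically in the mapping editor rather than in code; the sketch only shows the kind of copy, rename, and derive steps a mapping performs.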

Scripting nt_ico_ion_scripting_in_oneview

Activity to run scripts with custom Python code on incoming documents.

For this activity, you must create a new script or select an existing script.

See "ION Scripting".
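The actual ION Scripting API is documented separately; as a hedged illustration only, a script of this kind boils down to a Python function that receives document content and returns the possibly modified content. All names below are invented:

```python
def process_document(content: dict) -> dict:
    """Illustrative only: normalize a hypothetical "currency" field
    and flag large amounts on the incoming document."""
    result = dict(content)  # work on a copy, leave the input intact
    result["currency"] = result.get("currency", "USD").upper()
    result["largeAmount"] = result.get("amount", 0) > 10_000
    return result

doc = {"amount": 25_000, "currency": "eur"}
out = process_document(doc)
```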

Workflow nt_ico_workflow_activity_sml

Activity to start a workflow. In this activity, you select a workflow and link the input and output parameters of that workflow to attributes of the incoming document.

When a document arrives at this activity, a workflow is started. When the workflow is completed, the output of the workflow is added to the document and the document flow continues.

Note: Not available in Data Lake flows.
Splitter nt_ico_ion_split

Activity to split one document containing multiple instances of the same object into separate documents. Each document contains one object, together with a header and footer.
Note: Not available in Data Lake flows.
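The splitter behavior described above can be sketched in plain Python. This is a hedged illustration of the concept only; the field names are invented and do not reflect the ION document format:

```python
def split_document(doc: dict) -> list[dict]:
    """Illustrative splitter: one document with several instances of the
    same object becomes one document per object, each keeping the
    original header and footer."""
    return [
        {"header": doc["header"], "object": obj, "footer": doc["footer"]}
        for obj in doc["objects"]
    ]

batch = {"header": {"from": "ERP"}, "objects": [{"id": 1}, {"id": 2}], "footer": {"count": 2}}
parts = split_document(batch)
```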

Data Flow Modeler - Flows

This table shows the toolbox icons for the Flows category:

Name Widget Description
Parallel nt_ico_parallel-flow

Flow to use multiple activities in parallel. Use a parallel flow in these situations:
  • A document is sent to multiple connection points.
  • Documents are sent from multiple connection points to a single connection point.

A parallel flow has no properties to define. You can add or remove branches by right-clicking the diamond shape.

Filter nt_ico_filter-flow

Flow to limit the number of messages that are delivered to a connection point.

To define a filter, select one or more properties from the involved document and build a condition using these properties.

See Filtering and content-based routing.

A filter can also be used if the previous activity provides multiple document types. In that case, a filter is defined per document type. Not all document types require filtering.

Note: Not available in Data Lake flows.
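As a hedged plain-Python analogy, not the syntax of the ION condition editor, a filter condition combines document properties with comparison operators, and a document is delivered only when the condition holds. The property names below are invented:

```python
def passes_filter(doc: dict) -> bool:
    """Illustrative filter condition: deliver only orders from
    country "US" with an amount of at least 100."""
    return doc.get("country") == "US" and doc.get("amount", 0) >= 100

docs = [
    {"country": "US", "amount": 250},
    {"country": "NL", "amount": 400},
    {"country": "US", "amount": 50},
]
delivered = [d for d in docs if passes_filter(d)]
```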
Routing nt_ico_routing-flow

Flow to send a document to a subset of the available destinations, based on the document contents. To define the routing, select one or more properties from the involved document and build conditions using these properties.

A condition must be defined for each branch.

See Filtering and content-based routing.

You can add or remove branches by right-clicking the diamond shape. A routing flow can be used if the previous activity only provides a single document type.

Note: Not available in Data Lake flows.
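Content-based routing, where each branch has its own condition, can be sketched as follows. This is an illustrative analogy in plain Python; the branch names and properties are invented:

```python
def route(doc: dict) -> str:
    """Illustrative routing: each branch has a condition on document
    properties; the first matching branch receives the document."""
    if doc.get("region") == "EMEA":
        return "emea-branch"
    if doc.get("amount", 0) > 1000:
        return "high-value-branch"
    return "default-branch"

branch = route({"region": "APAC", "amount": 5000})
```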
Merge nt_ico_doc_merge_rgt

Flow to split the input document into two parallel branches, trigger activities in the top branch, and merge the resulting "patch" document with the input document.

There are two merge types:

  • Basic enrichment of the input document

    To define the basic enrichment, select one or more properties from the patch document and define where they should be placed in the output document.

  • Advanced merge using script

    To define the advanced merge, select the inputs from the patch or input document that are sent to the script step. Then select which script parameter contains the output document and which contains the output document headers.
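The basic enrichment variant can be sketched in plain Python. This is a hedged illustration of the concept only, not ION's merge implementation; the field names are invented:

```python
def merge_basic(input_doc: dict, patch_doc: dict, fields: list[str]) -> dict:
    """Illustrative basic enrichment: copy the selected properties from
    the patch document into a copy of the input document."""
    output = dict(input_doc)
    for field in fields:
        if field in patch_doc:
            output[field] = patch_doc[field]
    return output

original = {"orderId": "SO-1", "status": "new"}
patch = {"status": "approved", "approver": "jdoe"}
merged = merge_basic(original, patch, ["status", "approver"])
```

The advanced variant replaces the fixed property copy with a user-supplied script that produces the output document and its headers.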