Quick guide to Data Fabric
Sending data to Data Lake
You can send data to Data Lake in these ways:
- Batch Ingestion API. See Batch Ingestion API.
- Data Flows or Data Loader. See Infor ION.
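As a rough illustration of the first option, the sketch below assembles a Batch Ingestion upload. The base URL, endpoint path, header names, and object name are all placeholders, not the documented Infor API surface; take the real values from your ION API and Data Catalog configuration.

```python
import json

# Placeholder values -- substitute your tenant's real ION API gateway URL
# and the ingestion path from the Batch Ingestion API documentation.
ION_API_BASE = "https://mingle-ionapi.example.com/TENANT"      # placeholder
INGESTION_PATH = "/IONSERVICES/datalakeapi/v1/payloads/stream" # assumed path

def build_ingestion_request(object_name: str, records: list) -> dict:
    """Assemble the URL, headers, and newline-delimited JSON body
    for one batch upload of records belonging to one data object."""
    body = "\n".join(json.dumps(r) for r in records)
    return {
        "url": ION_API_BASE + INGESTION_PATH,
        "headers": {
            "Authorization": "Bearer <OAuth2 token from ION API>",
            "Content-Type": "application/json",
            # Hypothetical header identifying the target data object.
            "datalake-document-name": object_name,
        },
        "body": body,
    }

req = build_ingestion_request("CSVSchema_Orders", [{"id": 1}, {"id": 2}])
# An HTTP client (e.g. requests.post) would then send req["body"]
# to req["url"] with req["headers"].
```

The object name must match a schema registered in the Data Catalog; the upload is rejected otherwise.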
Viewing data in Data Lake
You can view data in Data Lake with Atlas.
For information on how to use Atlas, see Viewing data in Data Lake.
Validating data in Data Lake
With Data Ledger, you can validate that data sent from a source application arrived in Data Lake.
For information on how to use Data Ledger, see Validating data in Data Lake.
Extracting data from Data Lake
You can extract data from Data Lake in these ways:
- Data Lake Retrieval APIs
- Export a data object from Atlas. In Atlas, select a data object, click the three-dot icon in the toolbar, and select the export option.
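For the API route, the sketch below composes a retrieval URL. The base URL, path, and filter syntax are assumptions made for illustration; check the Data Lake Retrieval API reference for the real parameter names.

```python
from urllib.parse import urlencode

# Placeholder gateway URL and assumed retrieval path -- replace with the
# values from your ION API configuration and the Retrieval API reference.
ION_API_BASE = "https://mingle-ionapi.example.com/TENANT"   # placeholder
RETRIEVAL_PATH = "/IONSERVICES/datalakeapi/v1/payloads"     # assumed path

def build_retrieval_url(object_filter: str, max_records: int = 100) -> str:
    """Return a URL that lists Data Lake payloads matching a filter.
    The filter expression syntax here is hypothetical."""
    params = urlencode({"filter": object_filter, "records": max_records})
    return f"{ION_API_BASE}{RETRIEVAL_PATH}?{params}"

url = build_retrieval_url("dl_document_name eq 'CSVSchema_Orders'")
# An HTTP GET on this URL (with an ION API OAuth2 bearer token)
# would return the matching payloads.
```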
Querying data from Data Lake
You can query data from Data Lake with these tools:
- Compass in the Data Fabric application. See Using the Compass application page.
- Compass APIs. See Using Compass query APIs.
- JDBC Driver. See Compass JDBC driver.
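Compass queries are submitted as SQL jobs that you then poll for status and results. The endpoint layout sketched below is an assumption for illustration only; the actual paths are in the Compass query API reference.

```python
# Sketch of the typical Compass query lifecycle: submit a SQL statement
# as a job, poll its status, then fetch the result set. Every path below
# is hypothetical -- confirm against the Compass query API reference.
SQL = """
SELECT OrderNumber, OrderDate
FROM CSVSchema_Orders            -- data object name as exposed by Compass
WHERE OrderDate >= DATE '2024-01-01'
"""

def compass_endpoints(base: str) -> dict:
    """Hypothetical endpoint layout for the submit/status/result calls."""
    return {
        "submit": f"{base}/IONSERVICES/compass/v2/jobs",            # POST SQL
        "status": f"{base}/IONSERVICES/compass/v2/jobs/{{id}}/status",
        "result": f"{base}/IONSERVICES/compass/v2/jobs/{{id}}/result",
    }

eps = compass_endpoints("https://mingle-ionapi.example.com/TENANT")
```

The JDBC driver wraps this same lifecycle behind a standard connection, so SQL written for the Compass APIs works there too.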
Deleting data from Data Lake
To delete data from Data Lake, see these sections:
- Data Lake APIs. See Using the Delete APIs.
- Atlas. See Deleting data objects in Atlas.
- Purge. See Working with Purge logs.
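For the API route, the sketch below describes a delete call that targets payloads by filter. The path, parameter names, and filter syntax are assumptions; confirm them against the Delete API documentation.

```python
from urllib.parse import urlencode

# Placeholder gateway URL and an assumed delete path -- substitute the
# real values from the Delete API documentation for your tenant.
ION_API_BASE = "https://mingle-ionapi.example.com/TENANT"  # placeholder
DELETE_PATH = "/IONSERVICES/datalakeapi/v1/purge"          # assumed path

def build_delete_request(object_filter: str) -> dict:
    """Describe an HTTP DELETE removing payloads that match a filter.
    The filter expression syntax is hypothetical."""
    query = urlencode({"filter": object_filter})
    return {
        "method": "DELETE",
        "url": f"{ION_API_BASE}{DELETE_PATH}?{query}",
        "headers": {"Authorization": "Bearer <OAuth2 token from ION API>"},
    }

req = build_delete_request("dl_document_name eq 'CSVSchema_Orders'")
```

Deletions performed this way can be reviewed afterward in the Purge logs.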
Using Metagraphs
With Metagraphs, you can view and define relationships between data objects that are stored in Data Lake, and generate a Data Lake view based on the information defined in the model. These views can then be queried through the Compass Query Platform.
See Metagraphs.
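Because a generated view behaves like any other queryable object in Compass, querying it is plain SQL. The view and column names below are invented for illustration; your generated view's name comes from the Metagraph model.

```python
# Hypothetical query against a Metagraph-generated Data Lake view.
# "SalesOrders_View" and the column names are illustrative only --
# a real view name is determined by your Metagraph model.
VIEW_QUERY = """
SELECT OrderNumber, CustomerName
FROM SalesOrders_View            -- hypothetical generated view
WHERE OrderDate >= DATE '2024-01-01'
"""
```

The statement can be submitted through the Compass application page, the Compass APIs, or the JDBC driver like any other query.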
Enforcing data security
Before you switch on data security, ensure that you grant permissions to the users and applications that need, and are entitled to, data from Data Lake.
By default, the DATAFABRIC-SuperAdmin security role has full access to all Data Lake objects and views and to all pages and features.
By default, the Infor-SystemAdministrator role has full access to all pages and features.