Non-SQL scripts to automate data loads from Data Lake to Relational modeling

You can create non-SQL scripts to automate loading data into the Relational modeling tables. These tables are the data source for the OLAP cubes.

For example, based on the example from the "Best practices to load data from the source system to Infor d/EPM OLAP" topic, you can create these non-SQL scripts:

  • CA_FullLoad, which loads all dimension and fact data into a cube:
    --<<NON-SQL>>
    -- Loading to Staging from the Data Lake, deleting the tables first
    ExecuteLoadQuery("CA_DimAdventurer", "FullLoad", TRUE);
    ExecuteLoadQuery("CA_DimCoffee", "FullLoad", TRUE);
    ExecuteLoadQuery("CA_DimMeasure", "FullLoad", TRUE);
    ExecuteLoadQuery("CA_FactCoffeeAdventure", "FullLoad", TRUE);
    -- Loading hierarchies
    ExecuteHierarchyMapping("demo", "CA_Adventurer");
    ExecuteHierarchyMapping("demo", "CA_Coffee");
    ExecuteHierarchyMapping("demo", "CA_Measure");
    ExecuteHierarchyMapping("demo", "CA_Scenario");
    -- The cube load (full)
    ExecuteFactMapping("demo", "CA_CoffeeConsumption_FullLoad");
    
  • CA_IncrementalLoad, which loads new fact data from the fact table with the _temp suffix:
    --<<NON-SQL>>
    -- Loading to Staging from the Data Lake
    ExecuteLoadQuery("CA_DimAdventurer", "IncrementalLoad", FALSE);
    ExecuteLoadQuery("CA_DimCoffee", "IncrementalLoad", FALSE);
    ExecuteLoadQuery("CA_DimMeasure", "IncrementalLoad", FALSE);
    ExecuteLoadQuery("CA_FactCoffeeAdventure_temp", "IncrementalLoad", TRUE);
    -- Loading hierarchies
    ExecuteHierarchyMapping("demo", "CA_Adventurer");
    ExecuteHierarchyMapping("demo", "CA_Coffee");
    ExecuteHierarchyMapping("demo", "CA_Measure");
    ExecuteHierarchyMapping("demo", "CA_Scenario");
    -- The cube load (incremental)
    ExecuteFactMapping("demo", "CA_CoffeeConsumption_IncrementalLoad");
    
  • CA_AdventurerReload, which loads data selectively, for example, only for a specific adventurer:
    --<<NON-SQL>>
    
    -- Loading to Staging from the Data Lake
    ExecuteLoadQuery("CA_FactCoffeeAdventure_temp", "AdventurerLoad", TRUE);
    
    -- The cube load (Adventurer Reload)
    ExecuteFactMapping("demo", "CA_CoffeeConsumption_AdventurerReload");
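
If only fact data changes between runs, you can also create a fact-only variant that skips the dimension and hierarchy steps. This sketch is an assumption for illustration: the script name CA_FactOnlyLoad is hypothetical, and it reuses only the function calls and arguments shown in the scripts above; it presumes the dimensions and hierarchies were already loaded by CA_FullLoad or CA_IncrementalLoad:
    --<<NON-SQL>>
    -- Loading only the fact table to Staging from the Data Lake
    -- (assumes dimensions and hierarchies are already up to date)
    ExecuteLoadQuery("CA_FactCoffeeAdventure_temp", "IncrementalLoad", TRUE);
    -- The cube load (incremental, facts only)
    ExecuteFactMapping("demo", "CA_CoffeeConsumption_IncrementalLoad");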