Importing data to Expense Management - Upload_data.bat / Upload_data.sh

This section describes the functionality of the Upload_Data batch files and their associated configuration parameters.

Functionality:

The Upload_Data.bat file imports data files from the customer site into the Expense Management database. Customers should send the data files in UTF-8 format, particularly if the data is not in English. The data file to be uploaded has no fixed naming convention, but it must follow the format defined in the Extensity standard import / export mechanism documentation for the appropriate release. A single upload_data.bat run can upload multiple files to their respective database tables.

In most cases, as a default rule, data files must be uploaded to the staging tables. This rule is implemented through alex_xmdbtool, an XMDBTool-specific table in the Expense Management database, which is explained in detail in the section alex_xmdbtool specification. The tables to which data can be uploaded must be specified in the alex_xmdbtool table. By default, all the staging tables are specified, so the upload_data.bat file can upload data files to any staging table. However, if you must upload data to a specific transactional table, contact Customer Support to register that table name in the alex_xmdbtool table.

To ensure data integrity, a CRC-32 checksum is applied to each file that is uploaded.
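The integrity check can be illustrated with a minimal sketch. This is an illustrative example using Python's standard zlib module, not the tool's actual implementation:

```python
import zlib

def file_crc32(path, chunk_size=65536):
    """Compute the CRC-32 checksum of a file, reading it in chunks so
    large data files do not have to fit in memory."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(chunk, crc)  # running checksum over each chunk
    return crc & 0xFFFFFFFF  # normalize to an unsigned 32-bit value
```

Comparing the checksum computed on the client with one computed on the server detects corruption introduced in transit.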

The results of the upload process are written to a log file on the client machine. The log file naming convention is Upload_<TableName>.log, where TableName is the table name of the uploaded file. The log file is written to the data file directory on the client machine, which makes it easy to track the log files related to each table. By default, the log information for every run is appended to the same log file. However, by using the im_logMerge parameter, you can generate a new log file for each run. In this case, each generated log file has the date-time stamp appended to the log file name: Upload_<TableName>__<datetimestamp>.log

Configuration Parameters

Multiple parameters, some mandatory and some optional, can be defined for upload_data.bat/sh. All these parameters can be specified in the xmdbtool.properties file located in the <XMDBTool_Root_Install_Directory>/conf directory. Optionally, you can also specify them as command-line arguments to the batch files.

The following parameters are available for this file:

  • uploadTableName: This is a mandatory parameter that specifies the table to which the data file must be uploaded. For example, to upload a data file to the alex_application_filter table, specify uploadTableName=alex_application_filter
  • uploadFileName: This is a mandatory parameter that specifies the path of the data file to be uploaded. For example, to upload a file app_filter.txt stored on the local machine to the alex_application_filter table, specify uploadFileName=C:\\app_filter.txt.
Note: You must use double-backslash file separators.
The XMDBTool client is enhanced to support the -purgeStagingTables true option for the Upload_Data.bat or the Upload_Data.sh file. If the option is specified on the command line, the xmdbtool utility looks up the purgeStagingTableList entry in the xmdbtool.properties file. The -purgeStagingTables true option can be combined with other options or used independently. For example:
  • If purgeStagingTableList=alex_address|alex_cost_center is set in the xmdbtool.properties file and the Upload_Data.bat -purgeStagingTables true command is executed, Upload_Data purges data from the alex_address and alex_cost_center tables.
  • If purgeStagingTableList=alex_address is set in the xmdbtool.properties file and the Upload_Data.bat -uploadTableName alex_address -uploadFileName c:\temp\alex_address.txt -purgeStagingTables true command is executed, Upload_Data first purges data from the alex_address table and then uploads the data to it. The table name specified in the uploadTableName list does not need to match the purgeStagingTableList.
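The purge selection described above can be sketched as follows. This is a hypothetical illustration, assuming the purgeStagingTableList entry uses a bar (|) to separate table names, consistent with the tool's default delimiter; the real utility may parse the property differently:

```python
def tables_to_purge(properties, purge_flag):
    """Return the staging tables to purge when -purgeStagingTables true
    is given. `properties` stands in for the parsed xmdbtool.properties
    file; the bar separator is an assumption, not a documented fact."""
    if not purge_flag:
        return []  # option absent: nothing is purged
    raw = properties.get("purgeStagingTableList", "")
    return [t.strip() for t in raw.split("|") if t.strip()]
```

With purgeStagingTableList=alex_address|alex_cost_center, both tables are purged; without the command-line flag, the property is ignored.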

To upload multiple files

To upload multiple files, pass <tableName>=<fileName> name-value pairs in the uploadTableName parameter. Each <tableName>=<fileName> pair must be separated by a comma (,). For example, uploadTableName=alex_application_filter=C:\\app_filter.txt,alex_receipt_data=C:\\rec.txt

When the upload_data.bat command is used to upload multiple data files, the uploadFileName parameter is not used.
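The multi-file syntax above can be sketched as a small parser. This is an illustrative sketch of the documented name-value format, not the tool's code; the function name is hypothetical:

```python
def parse_upload_table_name(value):
    """Parse a multi-file uploadTableName value into (table, file) pairs.
    Pairs are separated by commas; within a pair, the table name and the
    file path are separated by the first '='."""
    pairs = []
    for entry in value.split(","):
        table, _, filename = entry.partition("=")
        pairs.append((table.strip(), filename.strip()))
    return pairs
```

Splitting on the first '=' only is what allows the file path itself to be kept intact.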

  • purgeData: This is an optional parameter that purges (deletes) all existing data in the table specified by the uploadTableName parameter. Purging is completed before the table is loaded with the data from the newly uploaded file. If purgeData is set to 1, the data in the table specified by the uploadTableName parameter is purged before the data in the newly uploaded file is inserted. To retain the existing data in the table when inserting new data from the uploaded file, set purgeData to 0 (or comment it out by adding # in front of the purgeData parameter). By default, purgeData is set to 0 and commented out. For example, purgeData=0
  • im_delimiter: This is an optional parameter that specifies the delimiter used in the data files to be uploaded. For example, if the delimiter in the data file is TAB, specify im_delimiter=TAB. If im_delimiter is not specified, the default delimiter is a bar (|).
  • chunkSize: This is an optional parameter that allows large files to be uploaded. Customers can define the chunk size, that is, the number of lines per chunk, by specifying this parameter in the xmdbtool.properties file. The default is 100000 lines. For example, if a large data file of about 200000 lines must be uploaded, upload_data.bat splits the input file into two files of 100000 lines each. If the chunkSize parameter is not specified and the input data file has fewer than 100000 lines, the data file is uploaded without splitting; if it has more than 100000 lines, the file is split internally and uploaded as separate files to the server. This parameter is very useful when uploading very large data files. By default, chunkSize=100000
  • im_logMerge: This is an optional parameter. By default, the results of all upload processes are written to a log file on the client machine, using the naming convention Upload_<TableName>.log, where TableName is the table name of the uploaded file. The log file is written to the data file directory on the client machine, which makes it easier to track the log files related to each table. By default, the log information for every run is appended to this log file. If im_logMerge is set to true, a single log file holds the log data for all runs. If set to false, a new log file is generated for each run, and these log files have the date-time stamp appended to the log file name, for example Upload_<TableName>__<datetimestamp>.log. By default, im_logMerge is set to true.
  • compress: This is an optional parameter that defines whether the data to be transferred is compressed for faster data transfer and quicker operations. By default, compression is enabled for the upload process. Setting this parameter to false disables compression, which is not recommended. By default, compress=true.
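The purgeData semantics can be sketched in a few lines. This is a conceptual illustration of the purge-then-load rule, with Python lists standing in for database rows; it is not the tool's implementation:

```python
def load_with_purge(existing_rows, new_rows, purge_data=0):
    """Apply purgeData semantics: when purgeData=1, delete the existing
    rows before loading; when purgeData=0, append the new rows to the
    data already in the table."""
    if purge_data == 1:
        existing_rows = []  # purge completes before the new load starts
    return existing_rows + new_rows
```

Note that commenting out purgeData in xmdbtool.properties has the same effect as purgeData=0.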
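The im_delimiter behavior can be sketched as follows. The mapping of parameter values to characters is a hypothetical assumption (the documentation names only TAB and the bar default):

```python
# Hypothetical mapping of im_delimiter values to delimiter characters.
DELIMITERS = {"TAB": "\t", "BAR": "|"}

def split_record(line, im_delimiter=None):
    """Split one data-file record on the configured delimiter.
    When im_delimiter is unset or unrecognized, fall back to the
    documented default, a bar (|)."""
    sep = DELIMITERS.get(im_delimiter or "BAR", "|")
    return line.rstrip("\n").split(sep)
```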
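The chunkSize splitting can be sketched with a generator. This mirrors the documented behavior (split the input into chunks of at most chunkSize lines) without claiming to be the tool's code:

```python
def chunk_lines(lines, chunk_size=100000):
    """Yield successive chunks of at most chunk_size lines, mirroring
    how upload_data.bat splits a large input file before uploading.
    A 200000-line file with the default chunkSize yields two chunks."""
    chunk = []
    for line in lines:
        chunk.append(line)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:  # final partial chunk, if any
        yield chunk
```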
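The im_logMerge log-naming rule can be sketched as follows. The exact <datetimestamp> format is an assumption made for illustration; the documentation specifies only that a date-time stamp is appended:

```python
from datetime import datetime

def log_file_name(table_name, im_log_merge=True, now=None):
    """Build the upload log file name. With im_logMerge=true (the default),
    a single merged log is reused across runs; with false, a date-time
    stamp (format assumed here) makes each run's log file unique."""
    if im_log_merge:
        return f"Upload_{table_name}.log"
    stamp = (now or datetime.now()).strftime("%Y%m%d-%H%M%S")
    return f"Upload_{table_name}__{stamp}.log"
```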