daimport - Data Area Data Import
Import data area data:
daimport -I InputDirectory | ZipFile dataarea[=newname] [OPTIONS] [dataarea[=newname] ...]
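For example, a basic import of a single data area from an export archive (the archive path and data area name here are hypothetical):
daimport -I /backups/hrdata.zip hrdev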
List files that would be imported:
daimport -l InputDirectory | ZipFile
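For example, to preview what the same hypothetical archive would import, without writing anything:
daimport -l /backups/hrdata.zip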
Use this utility to import data into application (non-GEN) data areas.
Program Option | Description |
---|---|
-I InputDirectory \| ZipFile | Specify the directory or zip file to import data from. |
-l InputDirectory \| ZipFile | Specify a directory or zip file to examine for a list of the files that would be imported. |
dataarea[=newname] | Specify the data area or data areas to import data into. Optionally, you can provide a different name for the data area. |
-e | Stop processing if a database error occurs. |
-o | Overwrite duplicate records in the application. |
-t | Specify the number of database files to import at a time; the default is 4. |
-u | Update duplicate records in the application. |
-w | Skip importing into application views. |
-x transactionsize | Specify the transaction size: the number of records held in the buffer before the transaction is committed. This value overrides any INSERTBUFSIZE setting in the db.cfg file. |
-V | Validate only (does not check for duplicate records). |
--deletedata | Delete application data and data area-specific environment data before the import. |
--email | Specify a replacement value for EmailAddress fields that are set. |
--sameenv | Use when importing into the same environment from which the data was exported. |
--errorfile errorFileName \| --noerrorfile | If you specify --errorfile errorFileName, errors are written to errorFileName. If you specify --noerrorfile, no error file is created. If you specify neither option, errors are written to a default error file. |
--enverrorfile | Specify the environment error file jar; the default is daimportErr{importname}_ENV.jar. When you use multiple data areas, specify importname=jarname[,importname=jarname ...]. |
--helpers | Allow helper threads so that multiple threads can load a single table; this can improve the performance of importing large tables. If you run daimport with multiple threads and this option, then when the number of tables remaining to be processed is fewer than the number of running threads, daimport uses the now-available threads to help import the remaining tables. |
--noenverrorfile | Do not create environment error files. |
--noconfigclear | Do not fire the configuration clear event. Using this option leaves the system out of sync, which is acceptable when the system is offline while these utilities run: the system will be restarted anyway, and the configuration cache is rebuilt automatically on first access, so processing the event is not needed. |
--ignoredups | Ignore duplicate records. |
Options for CSV data: | |
-N | Normalize records; enforce strict data type casting. |
-F | Force the import even if the headers do not all match. |
-M nn | The threshold, in megabytes, used to determine when to write large binary objects to a temporary file on disk during the import. The default is 20 MB. If the import fails, or is likely to fail, because of memory issues from large binary objects, lower this threshold. If your system has sufficient resources, you can consider increasing the threshold to speed up the import. Specify -1 to disable this option. |
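Options can be combined on one command line. A sketch of a fuller invocation, using the same hypothetical archive, that imports into a renamed data area, overwrites duplicate records, commits every 5000 records, and writes errors to a named jar (all paths and names are illustrative):
daimport -I /backups/hrdata.zip hrdev=hrtest -o -x 5000 --errorfile /tmp/daimportErr_hrtest.jar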