
About Data Export Capability

Data Export capability is enabled in select Cloud deployments to periodically copy the data that is stored in the Genesys historical database (called the Info Mart database) into local .csv files, so that the data is available for further import into a data warehouse.

Data Export capability exports data from fact and dimension tables that are part of the Genesys Info Mart dimensional model and creates a .zip archive containing individual .csv files, one file per database table. The export does not include aggregate (RAA) tables or internal (GIDB_*) tables except for certain configuration tables, as listed below. The fact and dimension tables included in your specific data export depend on the details of your Genesys Cloud agreement. The following tables are available for export:

In addition to the data from the Genesys Info Mart dimensional model tables, configuration details data is exported from the following tables:


The output data files are encoded in UTF-8.


By default, the export runs at 00:20, 08:20, and 16:20 every day. Genesys personnel can adjust the schedule as necessary for your Cloud deployment, but the export should run no more frequently than every 30 minutes.

File/directory structure

The export is incremental and uses special audit keys to identify changes in data since the last export. At each export, a chunk of exported data is written into a separate folder that is named according to the following naming convention: export_XXX

where XXX consists of:

  • an audit key identifier (audit key high-water mark)
  • the maximum date of data contained in all previous exports and this export, in GMT time zone, written in the YYYY_MM_DD_HH_MI_SS format.
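A minimal sketch of parsing these folder names follows. It assumes that XXX consists of the audit key followed by the maximum data date; that exact layout, and the sample folder names, are assumptions for illustration only.

```python
import re
from datetime import datetime, timezone

# Hypothetical folder-name layout: export_<audit-key>_<YYYY_MM_DD_HH_MI_SS>.
FOLDER_RE = re.compile(
    r"^export_(?P<audit_key>\d+)_"
    r"(?P<ts>\d{4}_\d{2}_\d{2}_\d{2}_\d{2}_\d{2})$"
)

def parse_export_folder(name):
    """Split an export folder name into (audit_key, max_data_ts)."""
    m = FOLDER_RE.match(name)
    if m is None:
        raise ValueError(f"not an export folder: {name}")
    ts = datetime.strptime(m.group("ts"), "%Y_%m_%d_%H_%M_%S")
    return int(m.group("audit_key")), ts.replace(tzinfo=timezone.utc)

# Plain name sorting can misorder audit keys of different widths,
# so sort by the parsed audit key instead.
folders = ["export_200074_2016_07_12_16_24_45",
           "export_13_2016_07_11_16_26_23"]
ordered = sorted(folders, key=lambda f: parse_export_folder(f)[0])
```

Sorting by the parsed audit key gives the processing order regardless of how the keys are zero-padded in the folder names.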

The output folder contains several .zip files, as follows:

  • export_XXX.zip — zip file with exported data. Each table is stored in a separate file with a file name in the format <table-name>.csv—for example, interaction_fact.csv. Within a .csv file, a header line identifies the table column names. Note that, within the exported .csv files, nulls and empty strings are represented as empty fields.
  • export_XXX.zip.sha1 — checksum for export_XXX.zip. The checksum can be validated with the sha1sum program (https://en.wikipedia.org/wiki/Sha1sum) and is used to verify on the receiving side that the .zip file is complete.
  • export_XXX.extracted.xml — metadata about export_XXX.zip.
The subfolder .gim is reserved for internal use.

Checksums are also generated for each individual table .csv file. If a table does not have any changes since the last export, nothing is written for that table.
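The receiving-side verification described above can be sketched as follows, assuming the .sha1 companion file uses the usual sha1sum layout (hex digest, optionally followed by the file name):

```python
import hashlib
from pathlib import Path

def verify_sha1(path):
    """Compare a file's SHA-1 digest against its companion .sha1 file.

    Assumes the .sha1 file holds the hex digest first, in sha1sum format.
    Works for export_XXX.zip as well as the per-table .csv checksums.
    """
    path = Path(path)
    expected = Path(str(path) + ".sha1").read_text().split()[0].lower()
    h = hashlib.sha1()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large archives do not load into memory at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected
```

A mismatch indicates a truncated or corrupted transfer, in which case the .zip file should be fetched again rather than imported.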

Export metadata file

The export_XXX.extracted.xml metadata file includes information about the export file, as shown in the example below.


<hwm-from audit-key="13" created-ts="1468259183"/>
<hwm-to audit-key="200074" created-ts="1468345485"/>


  • created-ts — the UTC timestamp, in seconds, since January 1, 1970, for the execution of the export
  • gim-schema-version — the version of the Info Mart database schema used by the tables
  • gim-version — the version of Genesys Info Mart Server that created the export files
  • hwm-from — the starting point of the data in the export by audit key and the create time, in UTC seconds, of that audit key
  • hwm-to — the ending point of the data in the export by audit key and the create time, in UTC seconds, of that audit key
  • max-data-ts — the maximum time, in UTC seconds, of the data contained in all previous exports and this export

The hwm-to value of one export run must match the hwm-from value of the next. Use these values to verify that no intermediate export file has been missed on the receiving side. For example, the next export following the example .xml file above is expected to have hwm-from audit-key = 200074.
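This continuity check can be sketched as follows. The metadata fragment shown above has no enclosing root element, so the sketch wraps one around it before parsing; that wrapper is an assumption, and should be dropped if the real file already carries a root element.

```python
import xml.etree.ElementTree as ET

def hwm_range(metadata_xml):
    """Extract the (hwm-from, hwm-to) audit keys from export metadata."""
    # Assumption: wrap the fragment in a root element so it parses as XML.
    root = ET.fromstring(f"<export>{metadata_xml}</export>")
    return (int(root.find("hwm-from").get("audit-key")),
            int(root.find("hwm-to").get("audit-key")))

def check_continuity(prev_xml, next_xml):
    """The next export's hwm-from must equal the previous export's hwm-to."""
    return hwm_range(prev_xml)[1] == hwm_range(next_xml)[0]
```

Running this check across consecutive export folders, in audit-key order, detects any export file that was lost in transfer.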

The maximum time span of data in any single export file is one day. For example, if historical reporting was unavailable for two days (because, for instance, the server or database was down), the export continues from the last exported high-water mark and moves ahead one day in the data. Each subsequent export continues from there, exporting no more than one day at a time, until the export has caught up with the current data.


Genesys provides an SQL script, make_gim.sql, to assist you in creating a target schema into which to import the exported Info Mart data. If the Info Mart database schema version changes after you have set up your target database, you may need to update the target schema and change the import processing to accommodate the new database schema.

The exported table data typically contains a mix of newly created and updated rows. For this reason, merge newly exported data with the existing data loaded from prior exports. For example, first load the export files into a temporary table, and then use an SQL merge statement, based on the table's primary key, to merge the data into a permanent target table that holds the cumulative data from all prior exports.
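A minimal merge sketch using SQLite is shown below. The table and column names are hypothetical (real targets use the schema created by make_gim.sql), and SQLite's INSERT ... ON CONFLICT upsert stands in for the vendor-specific MERGE statement of your warehouse database.

```python
import csv
import io
import sqlite3

# Hypothetical target table; real exports use the Info Mart schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE interaction_fact"
             " (interaction_id INTEGER PRIMARY KEY, status TEXT)")

def load_export_csv(conn, csv_text):
    """Merge one exported .csv chunk into the cumulative target table."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        conn.execute(
            "INSERT INTO interaction_fact (interaction_id, status)"
            " VALUES (?, ?)"
            " ON CONFLICT(interaction_id) DO UPDATE SET status = excluded.status",
            # In exported .csv files an empty field means NULL or empty string.
            (int(row["interaction_id"]), row["status"] or None),
        )
    conn.commit()

# Load two successive export chunks; the second updates row 2 and adds row 3.
load_export_csv(conn, "interaction_id,status\n1,new\n2,open\n")
load_export_csv(conn, "interaction_id,status\n2,closed\n3,new\n")
```

Because the upsert is keyed on the primary key, replaying an export chunk is idempotent, which simplifies recovery if the import job fails partway through a folder.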

Process the export folders in order by folder name.

If necessary, you can restart the export data stream from the beginning or from a fixed date. Also, you can re-export a time span backwards from the most recent export.


This page was last modified on March 28, 2017, at 21:41.