
Troubleshooting

This page describes troubleshooting steps for Billing Data Server (BDS).

Log

You can monitor the BDS application by viewing log files on the Docker host. All logs are located under the ./cloudbilling-prem.local/log/ directory. The figure Sample main log file provides an example.

There are four kinds of logs:

  • bds.log — the main log file, which contains log records of daily BDS runs.
  • bds_stats.log — captures records with statistical information in key=value format.
  • brsctl.log, brs_config_snapshotter.log, db_utils.log, control_validation.log, premise_loader.log, and sbc_brs_comparator.log — capture log records of BDS utilities that are run manually.
  • bds-audit.log — captures records with a severity level of AUDIT (Available only in releases 9.0.000.1x and later).

The log file format is:

Date Time, Log Level, Thread ID | Module Name, Function Name - <Processing date, Tenant_id, Tenant name> Message

The possible log levels are:

  • AUDIT (Available in releases 9.0.000.18 and later)
  • CRITICAL
  • ERROR
  • WARNING
  • INFO
  • DEBUG

Genesys recommends that you monitor the logs for CRITICAL and ERROR level messages.
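Given the log format above, CRITICAL and ERROR records can be flagged with a short script. The following is a minimal sketch; the regular expression and the sample lines are assumptions based on the format documented on this page, not output from a real deployment.

```python
import re

# Matches the documented format:
#   Date Time, Log Level, Thread ID | Module Name, Function Name - <...> Message
# The exact spacing is an assumption; adjust to your actual log lines.
LEVEL_RE = re.compile(r"^[^,]+,\s*(AUDIT|CRITICAL|ERROR|WARNING|INFO|DEBUG)\s*,")

def flag_records(lines, levels=("CRITICAL", "ERROR")):
    """Yield log lines whose level field is one of the given levels."""
    for line in lines:
        m = LEVEL_RE.match(line)
        if m and m.group(1) in levels:
            yield line

# Illustrative (made-up) log lines in the documented format
sample = [
    "2022-07-13 10:00:00, INFO, 140245 | loader, run - <2022-07-12, 1, Acme> started",
    "2022-07-13 10:05:00, ERROR, 140245 | loader, run - <2022-07-12, 1, Acme> load failed",
]
for record in flag_records(sample):
    print(record)
```

In practice you would feed this the files under ./cloudbilling-prem.local/log/ rather than an in-memory list.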

Exit codes

Beginning with release 9.0.004.01, BDS provides exit codes when the Brsctl.py and Dispatcher.py processes complete.

Brsctl.py exit codes

The BDS Control utility (Brsctl.py) returns exit codes when processing is complete, as described in the following table:

Code   Description
0      Success
1      Failure

Dispatcher.py exit codes

The Dispatcher.py utility returns exit codes when extraction, transformation, and load (ETL) processes complete, as described in the following table:

Code   Description
0      Success
1111   Error that does not involve extraction, transformation, or load
1000   Extraction error
100    Transformation error
10     Load error

More than one error code can be combined by summation (1111 excepted). For example, 1010 indicates that both extraction and load failed (1000 + 10 = 1010).
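The summation scheme above can be decoded mechanically. The helper below is a hypothetical sketch (it is not part of BDS); it only inverts the stage codes listed in the table.

```python
def decode_bds_exit_code(code):
    """Decode a Dispatcher.py exit code into the list of failed ETL stages.

    Stage codes per the table above: 1000 extraction, 100 transformation,
    10 load. 1111 is a generic error outside ETL; 0 is success.
    """
    if code == 0:
        return []
    if code == 1111:
        return ["non-ETL error"]
    stages = []
    if code >= 1000:          # extraction component present
        stages.append("extraction")
        code -= 1000
    if code >= 100:           # transformation component present
        stages.append("transformation")
        code -= 100
    if code >= 10:            # load component present
        stages.append("load")
        code -= 10
    return stages

print(decode_bds_exit_code(1010))  # ['extraction', 'load']
```

This makes the exit code usable from a wrapper script that decides, for example, whether only the load step needs to be retried.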

File Storage

BDS stores extracted and transformed data locally, in directories defined by the following variables in the .env file:

  • local_cache
  • local_extract_path (in release 9.0.001.01 and earlier, this parameter was named premise_extract_path)
  • local_transform_path (in release 9.0.001.01 and earlier, this parameter was named premise_transform_path)
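For reference, the relevant .env entries might look like the following fragment; the values shown are illustrative assumptions, not shipped defaults.

```shell
# Illustrative .env fragment (values are assumptions, not defaults)
local_cache=./local_cache
local_extract_path=extract
local_transform_path=transform
```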

Directory structure for extracted data

Extracted data is stored locally within the directory identified by the “./local_cache/local_extract_path” parameter, in the .env file. Data within that directory is stored in subdirectories, as follows:

  • For Genesys Info Mart data sets, each data set produces one CSV file, each day, in subdirectories with the following naming:
    /<tenant_id>/<dataset_name>/<year>/<month> (MM)/<date_label>.csv.gz
  • For GVP Call Detail Records (CDRs), there is a separate file for each location, in subdirectories with the following naming:
    /<tenant_id>/<dataset_name> (gvp_cdrs)/<region>/<location>/<year>/<month> (MM)/<day> (DD)/<date_label>.csv.gz
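The two layouts above can be expressed as path-building helpers. This is a hypothetical sketch (the function names, the root argument, and the "agents" dataset name are not part of BDS); it only mirrors the directory naming documented here.

```python
from pathlib import Path

def info_mart_extract_path(root, tenant_id, dataset, year, month, date_label):
    """Info Mart extract: /<tenant_id>/<dataset>/<YYYY>/<MM>/<date_label>.csv.gz"""
    return (Path(root) / str(tenant_id) / dataset
            / f"{year:04d}" / f"{month:02d}" / f"{date_label}.csv.gz")

def gvp_cdr_extract_path(root, tenant_id, region, location, year, month, day, date_label):
    """GVP CDR extract: /<tenant_id>/gvp_cdrs/<region>/<location>/<YYYY>/<MM>/<DD>/<date_label>.csv.gz"""
    return (Path(root) / str(tenant_id) / "gvp_cdrs" / region / location
            / f"{year:04d}" / f"{month:02d}" / f"{day:02d}" / f"{date_label}.csv.gz")

print(info_mart_extract_path("/data/extract", 1, "agents", 2022, 7, "2022-07-12"))
```

Helpers like these are convenient when auditing the cache, for example to check whether a given tenant and day actually produced a file.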

Directory structure for transformed data

Transformed data is stored locally within the directory identified by the “./local_cache/local_transform_path” parameter in the .env file. Data within that directory is stored in subdirectories, as follows:

  • For region-aware metrics:
    • Summary file:
      CNT_<US | EU | ...>_PES_<METRIC_NAME>_<GARNCODE_tenantID>_<datetime_with_timestamp>.CSV
    • Data file:
      <US | EU | ...>_PES_<METRIC_NAME>_<GARNCODE_tenantID>_<datetime_with_timestamp>.CSV
  • For global metrics:
    • Summary file:
      CNT_PES_<METRIC_NAME>_<GARNCODE_tenantID>_<datetime_with_timestamp>.CSV
    • Data file:
      PES_<METRIC_NAME>_<GARNCODE_tenantID>_<datetime_with_timestamp>.CSV

Timestamp encoding varies as follows:

  • All concurrent and non-concurrent files follow a naming convention in which the timestamp is T000000Z.
  • For all seats metrics, an additional file that captures the enabled seats count is generated daily. This file has the same name as the corresponding concurrent or non-concurrent daily file, but has the timestamp T000001Z.

For example:

  • US_PES_AGENT_EMAIL_1000_2016_10_20T000000Z.CSV
    Meaning: US region, premise True, seats_email metric, Tenant_ID (unique for each tenant), date with the T000000Z timestamp (Concurrent)
  • US_PES_AGENT_CHAT_1000_2016_10_11T000001Z.CSV
    Meaning: US region, premise True, seats_chat metric, Tenant_ID, date with the T000001Z timestamp (Enabled)
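A file name following these conventions can be unpacked programmatically. The parser below is a hypothetical sketch based solely on the patterns shown above; in particular, matching the GARNCODE_tenantID segment as digits is an assumption, and the region list is open-ended.

```python
import re

# Pattern derived from the naming conventions documented above:
#   [CNT_][<REGION>_]PES_<METRIC_NAME>_<GARNCODE_tenantID>_<YYYY_MM_DD>T<hhmmss>Z.CSV
NAME_RE = re.compile(
    r"^(?P<cnt>CNT_)?"              # optional summary-file prefix
    r"(?:(?P<region>[A-Z]{2})_)?"   # region prefix is absent for global metrics
    r"PES_"
    r"(?P<metric>.+)_"
    r"(?P<tenant>\d+)_"             # assumption: numeric tenant segment
    r"(?P<date>\d{4}_\d{2}_\d{2})"
    r"T(?P<ts>\d{6})Z\.CSV$"
)

def parse_bds_filename(name):
    """Return the name's components as a dict, or None if it doesn't match."""
    m = NAME_RE.match(name)
    if not m:
        return None
    d = m.groupdict()
    d["summary"] = bool(d.pop("cnt"))
    d["enabled_seats"] = d["ts"] == "000001"  # T000001Z marks the enabled-seats file
    return d

print(parse_bds_filename("US_PES_AGENT_EMAIL_1000_2016_10_20T000000Z.CSV"))
```

Such a parser lets a verification script group the transformed files by tenant and metric, or separate concurrent files from enabled-seats files.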
This page was last edited on July 13, 2022, at 20:07.