
Create a Calibration Evaluation

This content may not be the latest Genesys Engage cloud content. To find the latest content, go to Recording in Genesys Engage cloud.

The SpeechMiner UI Quality Management Calibration Evaluation creates evaluation sessions that can be used to compare evaluator performance and ensure consistency across teams. A Calibration Evaluation is performed on one evaluation in the same way as a Distributed Interaction Evaluation session; the difference is that the results of these evaluation sessions can be used in a Calibration Score report (that is, a report that compares how the evaluators filled out the same evaluation session).

You cannot create an evaluation without a form. Before you create an evaluation, verify that at least one form has been created.

Create a Calibration Evaluation

  1. Click Quality > Evaluations Manager. The Evaluations Manager grid appears.
  2. Click the drop-down arrow next to New Evaluation and select Calibration.
  3. In the Untitled Evaluation field, enter the name of your evaluation.
  4. In the Session Expires after field, enter the time (Days, Weeks, or Months) within which the evaluation session must be performed after the evaluation is created.
  5. Select the Forms tab and, from the Forms list, select the forms to add to the evaluation. As soon as you select a form from the list, it appears in the Evaluation Summary under Forms. Select the form again to remove it from the Evaluation Summary. The selected forms contain the questions that must be answered to complete the evaluation session.
  6. Select the Evaluators tab and, from the Evaluators list, select the evaluators that should perform the evaluation sessions. As soon as you select at least two evaluators, their names appear in the Evaluation Summary under Evaluators. Select an evaluator again to remove that evaluator from the Evaluation Summary. Each evaluator will receive an evaluation session to evaluate the agent(s) associated with the selected interactions. When you select more than one evaluator, the resulting Evaluation Sessions are evenly distributed across evaluators in round-robin fashion.
  7. Select the Interactions tab, configure the evaluation filter options on the left side of the screen to generate a list of interactions, and click Search.
    A Calibration evaluation has a One Time schedule: it creates an evaluation that produces only one evaluation session. A Calibration evaluation cannot be a Recurring evaluation.

    Your filter selections should be directly related to the business issue for which you want to evaluate the agent(s). For additional information, see Search Filter.

    1. Select one interaction. As soon as you select an interaction from the list, it appears in the Evaluation Summary under Interactions. Clear an interaction's checkbox to remove that interaction from the Evaluation Summary.
      If you select more than one interaction, the Calibration Evaluation session will not be activated.
  8. Click the Inactive toggle to set the evaluation to Active. Only an active evaluation will create evaluation sessions.
  9. Click Save and Activate. Once the evaluation is saved and activated, it will begin creating evaluation sessions according to its schedule.
The Evaluation Session list will automatically refresh when a new set of evaluation sessions is available. To manually refresh the Evaluation Session list, click the Refresh button.
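The round-robin distribution described in step 6 can be sketched as follows. This is a minimal illustration of the scheduling concept only; the function and variable names are hypothetical, and SpeechMiner does not expose this behavior as an API.

```python
from itertools import cycle

def distribute_sessions(sessions, evaluators):
    """Assign evaluation sessions to evaluators in round-robin order.

    Hypothetical sketch: each session is handed to the next evaluator
    in turn, so the workload stays evenly distributed.
    """
    assignment = {evaluator: [] for evaluator in evaluators}
    for session, evaluator in zip(sessions, cycle(evaluators)):
        assignment[evaluator].append(session)
    return assignment

# Five sessions spread across two evaluators (illustrative names):
result = distribute_sessions(
    ["session1", "session2", "session3", "session4", "session5"],
    ["evaluator_a", "evaluator_b"],
)
print(result)
# → {'evaluator_a': ['session1', 'session3', 'session5'],
#    'evaluator_b': ['session2', 'session4']}
```

Note that for a Calibration Evaluation only one interaction is selected, so in practice each evaluator receives a session for that same interaction; the round-robin spread matters when an evaluation covers many interactions.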


This page was last edited on October 2, 2020, at 12:39.

