Natural Language AI Integration
Intelligent Automation can integrate with external artificial intelligence (AI) services to provide natural language understanding (NLU) capabilities.
When connected to a natural language engine, Intelligent Automation allows a user to respond in natural language, even entering multiple pieces of information in a single response. The natural language engine infers the relevant information contained in the user's response, enabling subsequent Menu and Question blocks in the call flow to be skipped because the information has already been captured. The result is a successful conversation that is shorter than one using the directed-dialog approach.
How it works
The natural language engine works in the background while Intelligent Automation communicates directly with the user. Here's how:
- Intelligent Automation asks an open-ended question (for example, "What can I help you with?").
- The user provides a natural language response (for example, "What will the weather be like in London on Saturday?").
- Intelligent Automation receives the response and passes it to the natural language engine.
- The natural language engine responds in one of two ways:
- If the natural language engine needs more information to understand the request, it responds to Intelligent Automation with follow-up questions (for example, "For what city?").
- If the natural language engine has all the information it needs, it sends Intelligent Automation the Intent and associated Slots (in the Weather example, the Intent could be Weather and the Slots could be weatherLocation and weatherDate).
- When Intelligent Automation receives the Intent and Slots, it directs the interaction according to the configured application.
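The routing step above can be sketched as follows. This is a minimal illustration only: the payload field names (intent, slots, followUpPrompt) and the function name are assumptions for the sketch, not the actual Intelligent Automation or Dialog Engine API.

```python
# Hypothetical sketch of the routing step described above.
# Field names (intent, slots, followUpPrompt) are illustrative only.

def route_nlu_response(payload, intent_to_module):
    """Decide what to do with a natural language engine reply."""
    if payload.get("followUpPrompt"):
        # The engine needs more information: relay its follow-up question.
        return ("ask", payload["followUpPrompt"])
    # The engine returned an Intent and Slots: run the mapped module.
    intent = payload["intent"]
    slots = payload.get("slots", {})
    return ("run", intent_to_module[intent], slots)

mapping = {"Weather": "Weather Questions"}

# Follow-up case: the engine still needs the city.
print(route_nlu_response({"followUpPrompt": "For what city?"}, mapping))

# Complete case: the Intent plus both Slots arrive together.
print(route_nlu_response(
    {"intent": "Weather",
     "slots": {"weatherLocation": "London", "weatherDate": "Saturday"}},
    mapping))
```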
The following demonstrates a chat session with a natural language engine running in the background:
Intelligent Automation currently has built-in connections to Genesys Dialog Engine and Google Dialogflow (the v2 API). If you want to use other natural language AI services, contact your Genesys representative for help with a custom integration.
How to integrate with natural language AI services
To integrate with a natural language AI service, complete the following steps:
- Configure Default Server Settings (applies to Genesys Dialog Engine only)
- Configure NLU Settings
- Map Intents to modules
- Map Slots to questions
Configure Default Server Settings
If you're using Genesys Dialog Engine, set the Default Server Settings outlined below. Otherwise, skip this step.
- DialogEngine.JOPv2.BaseURL - The URL to the Dialog Engine API.
- DialogEngine.JOPv2.Password - The Dialog Engine Client Secret password.
- DialogEngine.JOPv2.TimeoutMillis - The length of time (in milliseconds) that Intelligent Automation waits for a response from Dialog Engine before throwing an error.
- DialogEngine.JOPv2.Username - The Dialog Engine Client ID.
If you're using PureCloud, set the Default Server Settings outlined below. Otherwise, skip this step.
- DialogEngine.JOPv3.BaseURL - The URL to the Dialog Engine API.
- DialogEngine.JOPv3.AuthBaseURL - The URL to the PureCloud Authentication API.
- DialogEngine.JOPv3.FetchTimeoutMillis - The length of time (in milliseconds) that Intelligent Automation waits for a response from Dialog Engine before throwing an error.
- DialogEngine.JOPv3.ClientID - The Dialog Engine Client ID.
- DialogEngine.JOPv3.ClientSecret - The Dialog Engine Client Secret password.
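As a sketch, a completed set of PureCloud (JOPv3) settings might look like the following. The URLs and values are placeholders for illustration; use the endpoints for your own region and your own OAuth credentials.

```
DialogEngine.JOPv3.BaseURL            = https://api.mypurecloud.com
DialogEngine.JOPv3.AuthBaseURL        = https://login.mypurecloud.com
DialogEngine.JOPv3.FetchTimeoutMillis = 5000
DialogEngine.JOPv3.ClientID           = <your OAuth Client ID>
DialogEngine.JOPv3.ClientSecret       = <your OAuth Client Secret>
```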
Configure NLU Settings
Once you have configured the Default Server Settings, open a Natural Language Menu module and click the NLU Settings tab.
From the NLU Engine menu, select the natural language engine you're using: either Genesys Dialog Engine or Google Dialogflow.
- If you select Genesys Dialog Engine, the Client ID and Client Secret password configured on the Default Server Settings page are displayed. Leave the Use default credentials box checked to use these credentials, or uncheck the box to override them for this particular application.
- If you select Google Dialogflow, you'll need to enter the Google API Service Account JSON. Google provides this JSON when you set up your Google services account.
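For reference, a Google service account JSON file generally has the shape below. The values shown are placeholders; paste the file exactly as Google generated it.

```json
{
  "type": "service_account",
  "project_id": "my-dialogflow-project",
  "private_key_id": "<key id>",
  "private_key": "-----BEGIN PRIVATE KEY-----\n<key material>\n-----END PRIVATE KEY-----\n",
  "client_email": "my-agent@my-dialogflow-project.iam.gserviceaccount.com",
  "client_id": "<numeric client id>",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```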
Support for Proxies
To use Google Dialogflow through a proxy, create an environment variable called GRPC_PROXY_EXP and set the value in the host:port format.
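On a Linux host, for example, the variable could be set as follows; the proxy address below is a placeholder for your own proxy's host and port.

```shell
# Route Dialogflow's gRPC traffic through a proxy (host:port format).
# proxy.example.com:3128 is a placeholder value.
export GRPC_PROXY_EXP=proxy.example.com:3128
```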
Map Intents to modules
Intelligent Automation reads all Intents, Utterances, and Slots associated with a domain and then displays that information on the Intents List page for the Natural Language Menu module. This is where you'll map each Intent to a module.
To map an Intent to a module, select a module from the Intelligent Automation Module to Trigger menu for that Intent. Intelligent Automation then automatically updates the application's call flow accordingly.
Map Slots to questions
In the natural language engine, an Intent contains Slots, which are key pieces of information you need to extract from the end user to process a request. For example, to provide a weather forecast, you would need two key pieces of information: city and date. These would be considered Slots.
For each Intent, you should have an associated module containing questions that extract the right information from the customer. For the Weather example, you would have a Weather Questions module containing two questions: "For what city?" and "For what date?" These questions map to the weatherLocation and weatherDate Slots, respectively.
Once you have created this module in Intelligent Automation, you need to map it to its associated Slots, as follows:
- Open the module that is linked to the Intent.
- For each Question block in that module that corresponds to a Slot, open the block and go to Question Options.
- Check the Store Answer as a Variable checkbox and enter the Slot name (for example, weatherLocation).
When a chat or voice session reaches one of those Question blocks, Intelligent Automation first checks whether the natural language engine has already sent that Slot. If it has, Intelligent Automation uses the Slot value as the answer to the question. Otherwise, Intelligent Automation asks the question and waits for a response.
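The Question-block behavior described above can be sketched as follows. The function name and slot dictionary are hypothetical names for illustration, not product APIs.

```python
# Illustrative sketch: a Question block reuses a Slot already captured
# by the natural language engine instead of re-asking the question.

def answer_question(slot_name, nlu_slots, ask):
    """Return the answer for a Question block mapped to slot_name."""
    if slot_name in nlu_slots:
        # The engine already extracted this Slot, so skip the question.
        return nlu_slots[slot_name]
    # Slot not captured yet: ask the question and wait for the response.
    return ask()

# Slots the engine sent for "What will the weather be like in London?"
nlu_slots = {"weatherLocation": "London"}

city = answer_question("weatherLocation", nlu_slots, lambda: "asked user")
date = answer_question("weatherDate", nlu_slots, lambda: "asked user")
print(city, date)  # weatherLocation is reused; weatherDate is asked.
```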