Genesys Voicebots (CE41) for PureConnect
What's the challenge?
When your customers call in to self-service or need an agent, they want to get off the phone as soon as possible. Traditional IVRs are complex menu mazes that are unfriendly to use and confuse customers. This leads to longer agent interactions and increases the cost of service for an organization.
What's the solution?
Deliver a smooth service experience for customers with a bot that intuitively understands customer issues in natural language. It improves contact center operations as agents do not need to intervene for repetitive issues.
Contents
Use Case Overview
Story and Business Context
Successful companies strive to provide good experiences for their customers, but doing so becomes challenging and expensive given the multitude of communication channels (voice and digital) that must be offered. Serving customers with live agents is increasingly expensive, so companies need to automate requests fully or partially where possible and serve customers faster and more economically. Customer service costs are sustainable for self-service channels such as in-app, web self-service, and virtual assistants, but they begin to increase with web chat and rise sharply with voice. The challenge for customer service leaders is to provide outstanding experiences by combining automation and artificial intelligence with human interaction where required.
Blending bots technology with voice, self-service and digital applications enables companies to optimize service without incurring costs associated with live customer interactions. Designing customer experiences with voice bots delivers the following:
- Ability to manage the customer's entire journey, automated wherever possible with a human touch when needed, for example through chatbots, voicebots, and Altocloud
- Create personalized bot experiences by sharing context across voice and digital channels and with agents
- Voicebots in general provide significantly higher benefits than a traditional IVR due to advancements in speech recognition technology and the introduction of Conversational AI, which enables more natural interactions and a higher-quality customer experience
- Quickly and easily deploy omnichannel bots with Natural Language Understanding (NLU) and pre-built microapps that leverage industry best practice
- Move to cloud at your own speed and at low risk with cloud, hybrid, and premise options
- Inclusion of voicebots powered by Natural Language Understanding from Google DialogFlow and Amazon Lex to provide natural, rich conversational experiences for customers who call Customer Service. Spoken phrases from customers are transcribed to text by Google Cloud Speech-to-Text and then interpreted by the chosen NLU application to identify the customer's intent (for example, book a flight), extract key data for the query such as origin and destination city, date, and time, and generate an appropriate response and/or further questions to continue the interaction (see the sketch below).
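To make the NLU step above concrete, the following is a minimal sketch of passing a transcribed utterance to Google DialogFlow ES via its Python client and reading back the detected intent, entities, and confidence. It is an illustration only, not part of the use case deliverables: the project and session IDs are placeholders, and in the actual solution the NLU call is orchestrated through Intelligent Automation rather than custom code.

```python
# Minimal sketch: send a transcribed utterance to Google DialogFlow ES and read
# back the detected intent, entities (parameters), and confidence.
# Assumes the google-cloud-dialogflow package and GCP credentials are configured;
# the project ID and session ID below are placeholders.
from google.cloud import dialogflow


def detect_intent(project_id: str, session_id: str, transcript: str,
                  language: str = "en-US") -> dict:
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    # The voicebot has already converted speech to text; pass the text to NLU.
    text_input = dialogflow.TextInput(text=transcript, language_code=language)
    query_input = dialogflow.QueryInput(text=text_input)

    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result
    return {
        "intent": result.intent.display_name,           # e.g. "book_flight"
        "confidence": result.intent_detection_confidence,
        "entities": result.parameters,                   # Struct of extracted entities
        "reply": result.fulfillment_text,                # prompt to play back to the caller
    }


if __name__ == "__main__":
    print(detect_intent("my-gcp-project", "caller-1234",
                        "I want to book a flight from London to Berlin on Friday"))
```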
Use Case Benefits*
The following benefits are based on benchmark information captured from Genesys customers and may vary based on industry, lines of business or Genesys product line:
Use Case Benefits | Explanation |
---|---|
Improved Containment Rate | Increased self-service interactions result in fewer agent-assisted interactions for repetitive or common requests. |
Improved Customer Experience | Reduce time to serve, which in turn improves Net Promoter Score (NPS). Improve self-service containment rates, handle time, and overall customer experience by routing to the right self-service application through correct identification of intent. |
Improved First Contact Resolution | Present a customer experience that is tailored to the individual based on who they are, why they might be interacting, and the status of the contact center. |
Reduced IT Operational Costs | Reuse of existing assets and the option to use less expensive speech alternatives. |
Summary
Genesys supports a “design once, deploy anywhere” concept for AI-driven bots to enable organizations to provide a seamless customer experience across voice and digital channels. This use case focuses on deploying a bot on the voice channel and can automatically be extended to digital channels by adding the Genesys Chatbots with Blended AI Use Case. The voice bot supports and orchestrates the following capabilities:
- Personalization – to tailor the experience based on context from the current interaction or from previous interactions
- Natural Language Understanding (NLU) – to derive intents and entities
- Intent classification uses NLU to classify the customer's intent with high accuracy and connect the customer to the correct self-service process or agent. NLU also derives entities, which are pieces of relevant information from customer utterances, such as account numbers, amounts, and other parameters, that are returned to complete the process corresponding to the intent, thus enabling a conversational experience (see the sketch after this list).
- Identification & Verification (ID&V) – to identify and verify the customer if necessary
- Directed Dialog – to automate relevant and structured business processes or provide information
- Involve another NLU/AI platform – if it specializes in a particular topic
- Hand off to a voice agent – to connect the customer to a live person with the full context of the interaction
- Offer a callback – if the call arrives outside of business hours or the wait time is long, the voicebot offers a callback on the voice channel
- Offer a voicebot survey depending on business context
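To illustrate how a returned intent and its entities might drive the orchestration described above, here is a small, hypothetical sketch. The intent and entity names (pay_bill, account_number, amount) and the handler functions are illustrative only; in the actual solution this mapping is configured in Intelligent Automation rather than hand-coded.

```python
# Hypothetical sketch: map an NLU result (intent + entities) to a self-service
# subflow and pre-populate the slots the subflow would otherwise ask for.
# Intent/entity names and handlers are illustrative, not product APIs.
from typing import Dict


def identify_and_verify(slots: Dict[str, str]) -> str:
    return f"Running ID&V for account {slots.get('account_number', 'unknown')}"


def collect_payment(slots: Dict[str, str]) -> str:
    amount = slots.get("amount")
    # Any entity already captured (e.g. the amount) is not asked for again.
    return f"Collecting payment of {amount}" if amount else "Asking caller for the payment amount"


def check_balance(slots: Dict[str, str]) -> str:
    return "Reading back the current balance"


# Each intent is wired to an ordered list of business processes (microapps).
SUBFLOWS: Dict[str, list] = {
    "pay_bill": [identify_and_verify, collect_payment],
    "check_balance": [identify_and_verify, check_balance],
}


def route(nlu_result: Dict) -> list:
    handlers = SUBFLOWS.get(nlu_result["intent"], [])
    return [handler(nlu_result.get("entities", {})) for handler in handlers]


if __name__ == "__main__":
    # Example NLU result for "I missed a bill payment and want to pay it now"
    print(route({"intent": "pay_bill",
                 "entities": {"account_number": "12345678", "amount": "50.00"}}))
```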
Use Case Definition
Business Flow
Business Flow Description
- A voice interaction is initiated (reactive or proactive) on the voice channel.
- The customer hears a standard welcome message from the voicebot.
- Customer information and/or context is retrieved from:
- Customer profile information stored in PureConnect
- Genesys User Data
- API call to third-party data source
- The customer hears a personalized message or menu or is handed over to a voice agent. Examples include:
- Custom voice message or update, such as "Your next order is due to be delivered on Thursday before 12"
- Most likely contact reason, such as "Do you want to find out about the loan application you have in progress?"
- Customer is provided personalized menu options.
- Customer is routed directly to a voice agent, for example, if they have an overdue payment or a callback is scheduled. If the customer is not handed over to a voice agent, then at this point the customer could end their call, confirm the contact reason, or continue.
- Once the customer has moved on from the Personalization stage, the voicebot asks an open-ended question like: “How may I help you?” to determine intent (and entities) and capture the customer's response.
- The customer speaks to the voicebot in natural language. For example, the customer could say, "I missed a bill payment and want to check my balance because I think I'm in arrears."
- The voicebot listens to the customer and converts the raw spoken utterance into text.
- The voicebot then sends the customer's response to the configured NLU engine. If intent and entities are returned, the conversation moves to the correct point in the interaction flow, which could be within one of the following subflows:
- Identification and Verification (ID&V)
- Automated business process (such as Payment Collection microapp)
- Handoff to live voice agent or schedule a callback (see the relevant use case for the channel).
- If intent and entities are not returned and the organization has Genesys Knowledge Center (GKC) or a third-party Knowledge Base, the system passes the raw textual transcription to the Knowledge Base via the Knowledge Base API to look for a result. If a relevant knowledge article is found, the results are played back to the customer and the customer moves to the next step. If a relevant knowledge article is not found, the voicebot plays a prompt like: "Sorry, I didn’t find any results. Re-phrase or say AGENT for live assistance." If intent and entities are not returned and the organization does not have a Knowledge Base, the voicebot plays a prompt such as: "Sorry, I didn’t understand your question. Please ask me another question or say AGENT for live assistance." (A sketch of this routing and fallback logic appears after this flow description.)
- Fall back to DTMF is an optional capability that can be quoted by Professional Services separately as it is not included in the scope of this use case.
- Upon completion of a task, the voicebot asks a follow-up question like: "Is there anything else I can help you with?"
- If the customer says something like “yes”, they're brought back to the open-ended question: "How may I help you?”
- If the customer says something like “no”, the voicebot decides whether to offer them a survey (see the next step). If the customer responds with a more advanced answer, the NLU Engine determines intent and entities for further processing.
- Customer information and/or context is retrieved from the following sources to determine whether to offer a survey (the survey steps that follow are available within Genesys Intelligent Automation only):
- Customer profile information stored in PureConnect
- Genesys User Data
- API call to third-party data source
- Logic defined in Intelligent Automation
- If a survey is to be offered, the voicebot continues to the next step.
- If no survey is to be offered, the voicebot plays a goodbye message and the call ends.
- The voicebot asks the customer: "Would you like to participate in our survey?"
- If the customer answers something like "yes", then they continue to the next step and engage in a survey.
- If the customer answers something like "no", the voicebot plays a goodbye message and the call ends.
- Optional: If the survey results meet certain criteria based on the configured evaluation parameters, a specific action can be taken. For example, if the customer provides a negative response, they can be routed to a live agent.
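The sketch below condenses the routing and fallback logic described in this flow: try NLU first, then the Knowledge Base (if one is configured), then a reprompt that also offers an agent. The NLU and Knowledge Base calls are stubbed out as hypothetical helpers; in the actual solution this logic lives in the Intelligent Automation call flow.

```python
# Hypothetical sketch of the fallback chain described above. All helpers are
# stubs standing in for Intelligent Automation components.
from typing import Optional

NO_RESULT_PROMPT = "Sorry, I didn't find any results. Re-phrase or say AGENT for live assistance."
NOT_UNDERSTOOD_PROMPT = ("Sorry, I didn't understand your question. "
                         "Please ask me another question or say AGENT for live assistance.")


def detect_intent(transcript: str) -> Optional[dict]:
    """Stub for the NLU call; returns None when no intent/entities are recognized."""
    return None


def search_knowledge_base(transcript: str) -> Optional[str]:
    """Stub for the Knowledge Base API lookup; returns an article or None."""
    return None


def handle_utterance(transcript: str, has_knowledge_base: bool) -> str:
    nlu_result = detect_intent(transcript)
    if nlu_result:
        # Intent found: continue into ID&V, an automated process, or agent handoff.
        return f"Route to subflow for intent '{nlu_result['intent']}'"

    if has_knowledge_base:
        article = search_knowledge_base(transcript)
        if article:
            return article          # played back to the caller, then the follow-up question
        return NO_RESULT_PROMPT

    return NOT_UNDERSTOOD_PROMPT


if __name__ == "__main__":
    print(handle_utterance("I think I'm in arrears", has_knowledge_base=True))
```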
Business and Distribution Logic
Business Logic
BL1: Agent Handoff: The customer can ask to be connected to an available voice agent. Other context can also be displayed as Case Data.
BL2: Retries: The number of retries for self-service tasks and questions can be configured by a business user. Upon reaching the maximum number of retries, the dialog can be configured to play a message, hand off to a voice agent, or offer a callback if agents are busy or it is outside of business hours (see the sketch after this list).
BL3: Response Type: The interaction flows can be configured to accept natural language spoken responses and closed spoken responses such as account number, date of birth, and yes/no questions.
BL4: Callback: If outside of business hours, or estimated wait time (EWT) is high, the voicebot can offer a callback. If this option is not included, then a message is played back to the caller that a transfer is not possible.
BL5: Survey: The customer can choose whether or not to take a survey. Surveys can be based on:
- Customer profile information stored in PureConnect
- Genesys User Data
- API call to third-party data source
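A minimal sketch of the retry behavior in BL2 and the callback decision in BL4 follows. The retry limit, estimated-wait-time threshold, and helper functions are illustrative placeholders; in the product these values are configured by a business user rather than coded.

```python
# Hypothetical sketch of BL2/BL4: retry handling with agent or callback fallback.
# Thresholds and helpers are illustrative only.
from dataclasses import dataclass


@dataclass
class RetryPolicy:
    max_retries: int = 2              # configurable by a business user
    ewt_threshold_seconds: int = 300  # illustrative EWT threshold


def within_business_hours() -> bool:
    return True                       # stub: real check comes from the call flow schedule


def estimated_wait_time_seconds() -> int:
    return 600                        # stub: real value comes from the queue


def on_max_retries(policy: RetryPolicy) -> str:
    if (not within_business_hours()
            or estimated_wait_time_seconds() > policy.ewt_threshold_seconds):
        return "Play apology message and offer a callback"
    return "Play apology message and hand off to a voice agent"


def ask_with_retries(question: str, get_answer, policy: RetryPolicy = RetryPolicy()) -> str:
    for _ in range(policy.max_retries + 1):
        answer = get_answer(question)
        if answer:                    # a recognized, valid response
            return f"Continue dialog with answer: {answer}"
    return on_max_retries(policy)


if __name__ == "__main__":
    print(ask_with_retries("What is your date of birth?", lambda q: None))
```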
Parameters Influencing VoiceBot Behavior
This Use Case is supported across industry verticals. The basic features of voicebot business logic such as personalization are parametrized. Example parameters include:
- Personalization
- Segmentation, offer management, characteristics, and most likely contact reason
- Intents - the goal of the interaction; for example, a "pay_bill" intent returned by the NLU engine would indicate that the customer should be presented with an authentication business process followed by a payment business process
- Entities - additional pieces of key information returned by the NLU engine. Entities can accelerate the conversation by pre-populating answers to subsequent questions
- Confidence levels - the NLU engine's confidence in the returned intent (see the sketch at the end of this section)
Agent Handoff
- Based on user choice, such as "I want to speak to an advisor"
- Based on default handling, such as retries, timeouts, global commands
- Based on application logic, such as customer with an outstanding debt and application decides to transfer
Callback
- Estimated wait time to determine whether to offer callback
Survey
- Based on context from PureConnect, Intelligent Automation, Genesys User Data, or third-party web service
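To show how the confidence-level parameter might influence voicebot behavior, here is a small hypothetical sketch: below a low threshold the voicebot reprompts or offers an agent, between thresholds it confirms the intent, and above the high threshold it proceeds. The threshold values are illustrative and would be tuned per deployment.

```python
# Hypothetical sketch: using the NLU confidence level to decide whether to
# proceed, confirm, or reprompt / hand off. Threshold values are illustrative.

CONFIRM_THRESHOLD = 0.45   # below this, reprompt the caller or offer an agent
PROCEED_THRESHOLD = 0.80   # above this, act on the intent without confirmation


def next_action(intent: str, confidence: float) -> str:
    if confidence >= PROCEED_THRESHOLD:
        return f"Proceed with '{intent}'"
    if confidence >= CONFIRM_THRESHOLD:
        return f"Confirm: 'Did you want to {intent.replace('_', ' ')}?'"
    return "Reprompt the caller or offer a voice agent"


if __name__ == "__main__":
    for score in (0.92, 0.60, 0.30):
        print(score, "->", next_action("pay_bill", score))
```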
Distribution Logic
When the conversation is handed over to a live agent, the interaction moves to one of these use cases, depending on the channel the customer is using - see the Use Case Interdependencies section for related use cases.
User Interface & Reporting
Agent UI
N/A
Reporting
Real-time Reporting
Key real-time metrics that can be included in a report are:
- Current interactions waiting in the system
- Total interactions
- Agent Group Status
Historical Reporting
- How many conversations took place over a period
- Length of time for each conversation: maximum/minimum/average
Customer-facing Considerations
Interdependencies
All required, alternate, and optional use cases are listed here, as well as any exceptions.
All of the following required: | At least one of the following required: | Optional | Exceptions |
---|---|---|---|
Inbound | None | None | None |
General Assumptions
- This Use Case supports the voice channel. Genesys Chatbots (CE31) for PureConnect supports digital channels.
- This Use Case is supported by industry templates that contain examples of voicebot applications combining personalization, natural language understanding, AI, and microapps. Voicebot application requirements including required microapps will be confirmed during design. These application templates are created for Financial Services, Telco, and Travel.
- Handoff to voice agent is on the same channel.
- Access to a third-party NLU must be routed through Intelligent Automation.
- NLU capabilities for non-English languages can be supported through third-party NLU engines such as Google DialogFlow. Integration to third-party NLU/AI engines is a customization task. Dialogs that do not require NLU can support any language by using personas.
- Integration to Knowledge Center is a customization task; an example integration code snippet can be provided (a hypothetical sketch appears after this list).
- Callback dialog flow is provided by the Intelligent Automation Smart Transfer microapp.
- Chat transcript is not passed to callback agent.
- Survey dialog flow is provided by the Intelligent Automation Questionnaire Builder microapp. Results are available for download from the Intelligent Automation Control Center or via web service.
- Additional infrastructure is required to support UniMRCP and Google ASR.
- The Intelligent Automation Control Center to configure voicebots is localized to support the following languages:
- English (United Kingdom)
- French
- Spanish (Mexican)
- German
- The IVR transcript is not passed.
- The voicebot is deployed in a Hybrid model, that is, Genesys PureConnect infrastructure on premise with Google components in the cloud.
- Google ASR is connected through a universal UniMRCP connector that supports voicebots through Intelligent Automation; if a customer needs voice-based ASR outside this scope, they require Nuance.
- Cloud deployment requires extra lead time for a customer-specific infrastructure plan to be created. Cloud Operations supports many customers, and a new deployment enters the queue with other requests.
- Dialog Engine is not available for PureConnect. It is only available on Genesys Cloud CX.
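As noted above, Knowledge Center integration is a customization task. The sketch below shows only the general shape such an integration might take: a REST call that passes the raw transcript to a knowledge search endpoint and returns the best article, if any. The URL, payload fields, and response structure are entirely hypothetical placeholders and do not reflect the actual Genesys Knowledge Center API.

```python
# Entirely hypothetical sketch of a Knowledge Base lookup. The endpoint URL,
# request payload, and response fields are placeholders, not the real
# Genesys Knowledge Center API; the actual integration is a customization task.
from typing import Optional
import requests

KB_SEARCH_URL = "https://kb.example.com/api/search"   # placeholder endpoint


def search_knowledge_base(transcript: str, max_results: int = 1) -> Optional[str]:
    response = requests.post(
        KB_SEARCH_URL,
        json={"query": transcript, "maxResults": max_results},   # placeholder schema
        timeout=5,
    )
    response.raise_for_status()
    articles = response.json().get("articles", [])                # placeholder schema
    # Return the top article's text so the voicebot can play it back, or None
    # so the voicebot falls back to its "no results" prompt.
    return articles[0]["answer"] if articles else None
```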
Customer Responsibilities
- Voicebot configuration and settings will be quoted as part of a Professional Services engagement to capture requirements and business logic.
Document Version
- Version 1.0.2 last updated November 9, 2021