Quick Start

From Genesys Documentation

To use Genesys Predictive Routing (GPR), you'll need to install and configure the products and components listed below.

  1. Install any Genesys components that aren't already part of your environment. You'll need:
    • Interaction Concentrator
    • Genesys Info Mart
  2. Install Genesys Predictive Routing, which consists of the following components:
    • Data Loader—Imports agent and interaction data automatically from the Genesys Info Mart Database, and uploads customer and outcome data from user-prepared CSV files.
      • Data Loader is deployed in a Docker container and requires that you deploy a supported version of Docker.
    • URS Strategy Subroutines


To complete your setup of Predictive Routing, configure the following components:

  1. Set the desired values for the configuration options, as described in the configuration instructions for the URS Strategy Subroutines component and in Configure Data Loader to upload data. The complete list of Predictive Routing configuration options is available in the Genesys Predictive Routing Options Reference.
    Configuration options control a wide range of application behavior, including:
    • The mode in which GPR runs, including turning it off entirely.
    • Schemas and other configuration for Data Loader and data uploads.
    • Login parameters and access URLs.
    • The KPI criteria that determine what counts as a better match.
    • Scoring thresholds, agent hold-out, and dynamic interaction priority.
    • Many other important functions.
  2. (STAFF users only) To configure how the match between interactions and agents is determined, follow the procedures accessed from the Create and Train Predictors and Models page, located in the Predictive Routing Help.
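Two of the option areas listed in step 1—scoring thresholds and agent hold-out—can be illustrated with a minimal conceptual sketch. This is not GPR code or actual option names; the function and parameter names below are hypothetical, invented only to show how a threshold and a hold-out fraction interact.

```python
import random

def select_agent(scores, threshold=0.5, holdout_rate=0.1):
    """Illustrative sketch only (names are hypothetical, not GPR options).

    scores: mapping of agent id -> predicted score for this interaction.
    threshold: minimum score an agent needs to be eligible.
    holdout_rate: fraction of interactions routed WITHOUT scoring,
        so a baseline can be measured against GPR-routed traffic.
    """
    # Agent hold-out: skip scoring for a sample of interactions.
    if random.random() < holdout_rate:
        return None  # fall back to default routing
    # Scoring threshold: only agents at or above the threshold qualify.
    eligible = {agent: s for agent, s in scores.items() if s >= threshold}
    if not eligible:
        return None
    return max(eligible, key=eligible.get)
```

With `holdout_rate=0.0`, the best eligible agent is always returned; with `holdout_rate=1.0`, every interaction falls back to default routing, forming the comparison group.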

Review your data

Data Loader draws interaction and agent data from the Genesys Info Mart Database.

Using Data Loader, you can also import agent, customer, and outcome data compiled in .csv format. The GPR web application enables you to view and analyze your uploaded data.
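A user-prepared CSV file might be assembled as in the following sketch, which uses Python's standard `csv` module. The column names here are hypothetical placeholders; the actual fields must match the dataset schema you configure for Data Loader.

```python
import csv

# Hypothetical columns -- check the dataset schema configured on your
# Data Loader Application object for the actual field names.
rows = [
    {"interaction_id": "0001ABCD", "customer_id": "C-100", "outcome": 1},
    {"interaction_id": "0001ABCE", "customer_id": "C-101", "outcome": 0},
]

with open("outcomes.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["interaction_id", "customer_id", "outcome"]
    )
    writer.writeheader()   # first row: the schema's column names
    writer.writerows(rows)
```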

The resulting data is uploaded to the GPR Core Platform as datasets; the agent and customer datasets are named Agent Profile and Customer Profile, respectively. A dataset is a collection of raw event data, and its primary purpose is to serve as the source of predictor data.

  • You configure the dataset schemas and upload parameters using configuration options set on the Data Loader Application object.
  • After Data Loader imports a dataset, you can append additional data as long as it is consistent with the schema that has already been established.
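The schema-consistency requirement for appended data can be sketched as a simple header check before upload. This is an illustrative pre-check you might run on your own CSV files, not part of Data Loader itself; the file and column names are hypothetical.

```python
import csv

def schema_matches(established_header, new_csv_path):
    """Appended data must be consistent with the schema already
    established for the dataset, so compare the new file's header
    row against the established column list before uploading."""
    with open(new_csv_path, newline="") as f:
        return next(csv.reader(f)) == established_header

# Hypothetical new batch of outcome data with two columns.
with open("new_batch.csv", "w", newline="") as f:
    f.write("interaction_id,outcome\r\n0001ABCD,1\r\n")

print(schema_matches(["interaction_id", "outcome"], "new_batch.csv"))  # True
```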

Create predictors and models

Note: Only STAFF users can create predictors and models.

Predictors are based on the dataset information that you have imported.

  • A predictor defines a view on an underlying dataset. It can select some or all of the data in that dataset, and you can use a predictor with multiple datasets.
  • A model is based on a predictor, and uses the same target metric or KVP as that predictor. You can configure multiple models for each predictor. These models can use different selections of the features available in the underlying dataset. Models are the objects actually used to perform agent scoring and interaction matchups.
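The relationship described above—one predictor with a single target metric, carrying multiple models that each use a different feature subset—can be sketched with plain data classes. This is a conceptual illustration only, not the GPR object model; all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Model:
    name: str
    features: list  # a selection of the features available in the dataset

@dataclass
class Predictor:
    name: str
    target_metric: str            # shared by every model under this predictor
    models: list = field(default_factory=list)

# Hypothetical predictor targeting a "transferred" outcome metric.
p = Predictor("transfer_predictor", target_metric="transferred")
p.models.append(Model("baseline", ["agent_tenure"]))
p.models.append(Model("extended", ["agent_tenure", "customer_segment"]))
```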

As you configure a predictor, you choose the metric you want to work with, the kinds of situations you want to evaluate, and other parameters, constructing a way to determine the Next Best Action in a given situation based on the actions available at that time. As you gather more data, you can add it to your dataset and test the predictor against the actual results coming in, enabling you to refine how successful your predictor is.
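Testing a predictor against incoming actual results amounts to comparing predictions with observed outcomes. A minimal sketch of that comparison (not a GPR function—just the underlying arithmetic):

```python
def hit_rate(predicted, actual):
    """Fraction of cases where the predicted outcome matched the
    actual result that later arrived. Both sequences are aligned
    lists of outcome values for the same interactions."""
    matches = sum(p == a for p, a in zip(predicted, actual))
    return matches / len(actual)

print(hit_rate([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```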

Reporting and analysis

You can report on various parameters, such as:

  • The success of your predictors.
  • The results of A/B testing.
  • The factors affecting a KPI you are trying to influence.
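The A/B-testing comparison above boils down to measuring the KPI rate in GPR-routed traffic against the rate in a control group (for example, the agent hold-out sample) and reporting the difference. A minimal sketch of that calculation, not tied to any GPR reporting API:

```python
def ab_lift(control_outcomes, test_outcomes):
    """Difference in KPI success rate between the test group
    (GPR-routed interactions) and the control/hold-out group.
    Outcomes are 1 for success, 0 for failure."""
    control_rate = sum(control_outcomes) / len(control_outcomes)
    test_rate = sum(test_outcomes) / len(test_outcomes)
    return test_rate - control_rate
```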

Reports are available through the following reporting applications:
