PE-GPR/9.0.0/Deployment/DL-CfgFile

Create and specify values for the configuration options in the Data Loader Application object to select data from Genesys Info Mart, specify the correct schemas for your data, upload data, and join data.


Overview

The Predictive Routing Options Reference (https://docs.genesys.com/Documentation/Options/Draft/GPM/Welcome) contains a complete list of all Predictive Routing options, with descriptions, default and valid values, and a brief explanation of how to use them.

This topic focuses specifically on data-upload configuration and provides a task-based perspective, enabling you to find which options you need to configure for specific upload types and giving more context to help you decide on the appropriate values for your environment.

  • These sections and options are configured on the Data Loader Application object in your configuration manager application (GAX, Genesys Administrator).

The dataset-<name> Section

The dataset-<name> section is used to configure Datasets. The <name> part of the section name should be replaced by a Dataset name of your choice.

Create a separate dataset-* section for each Dataset, with each section containing the necessary options.

Note that a Dataset configured in a dataset-<name> section can include fields joined from other Datasets, from the Agent Profile, or from the Customer Profile. Use the following options to configure your Datasets (a configuration sketch follows the list):

  • chunk-size
  • data-type
  • end-date
  • join
  • join-keys
  • join-type
  • location
  • sql-query
  • start-date
  • update-period
  • upload-dataset
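
To illustrate how these options fit together, a user-defined Dataset section might look like the following sketch. The section name dataset-mydataset is a placeholder, and the angle-bracket values are informal glosses based only on the option names; they are assumptions for this example, not prescribed values. See the Predictive Routing Options Reference for the default and valid values of each option.

[dataset-mydataset]

  • data-type=<type of data contained in this Dataset>
  • location=<location of the source data for this Dataset>
  • chunk-size=<size of each chunk of uploaded data>
  • start-date=<earliest date of data to upload>
  • end-date=<latest date of data to upload>
  • update-period=<how often Data Loader updates the Dataset>
  • upload-dataset=<whether Data Loader uploads this Dataset>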

NOTE: The following section names are reserved for the Agent Profile Dataset and Customer Profile Dataset, and cannot be used for any other purpose. The options required for these special sections are listed below the section names. If an option has a value specified, that is the mandatory value for that option when used in that section.

  • dataset-agents-gim
    • sql-query
    • data-type=agents
    • enforce-schema-on-joined-data
    • join
    • join-type
    • upload-dataset
  • dataset-agents-csv
    • csv-separator
    • data-type=agents
    • join-keys
    • location
    • upload-dataset=false
  • dataset-customers
    • csv-separator
    • data-type=customers
    • join-keys
    • location
    • upload-dataset
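
For example, based on the list above, a dataset-agents-csv section combines the fixed values shown there (data-type=agents and upload-dataset=false) with values specific to your environment. The angle-bracket placeholders in this sketch are assumptions, not prescribed values:

[dataset-agents-csv]

  • csv-separator=<separator character used in the CSV file>
  • data-type=agents
  • join-keys=<columns used to join the Agent Profile data>
  • location=<location of the CSV file containing the Agent Profile data>
  • upload-dataset=false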

The schema-<name> Section

The schema-<name> section is used to define the schema for the Dataset with the matching <name>. That is, the schema-mydataset section contains options describing the data schema to be used in the Dataset configured in the dataset-mydataset section.

The options in the schema-<name> sections depend on the requirements for the specific Dataset. Each option name is a Dataset column name. Each option value gives either the source of the field's value (which might be the joined schema from which the value is obtained) or its data type, followed, for fields containing sensitive or PII data, by the value anon.

The following examples show possible schemas:

[schema-interactions-joined]

  • AHT=numeric
  • Customer_field_1=schema-customers
  • CUSTOMER_HOLD_DURATION=numeric
  • CUSTOMER_TALK_DURATION=numeric
  • CUSTOMER_ACW_DURATION=numeric
  • Customer_field_2=schema-customers
  • Customer_ANI=schema-customers
  • employeeId=schema-agents-gim
  • Feedback=schema-feedback
  • groupNames=schema-agents-gim
  • InteractionID=string
  • Is_Resolved=schema-feedback
  • skills=schema-agents-gim
  • START_TS=timestamp,created_at

[schema-feedback]

  • Customer_ANI=string,anon
  • employeeId=string,anon
  • Feedback=string
  • InteractionID=string
  • Is_Resolved=boolean
  • Timestamp_Feedback=timestamp,created_at
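
To read these examples in terms of the value components described above (this is an informal gloss, not additional syntax):

  • Is_Resolved=boolean: the Is_Resolved column holds boolean data.
  • Customer_ANI=string,anon: in schema-feedback, the Customer_ANI column holds string data and carries the value anon because it contains PII.
  • Customer_field_1=schema-customers: in schema-interactions-joined, the value for this column is obtained from the joined schema-customers schema.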

NOTE: The following section names are reserved for the Agent Profile schemas and the Customer Profile schema, and cannot be used for any other purpose. The options required for these schemas are listed under the section names.

  • schema-agents-gim
    • agent_field_1
    • agent_field_2
    • dbID
    • employeeId
    • firstName
    • groupNames
    • lastName
    • skills
    • tenantDbID
    • userName
  • schema-agents-<name> - shows how to configure the schema for an agent Dataset to be uploaded from a CSV file
    • agent_field_1
    • agent_field_2=timestamp
    • employeeId=string,id,anon
  • schema-customers
    • Customer_ANI=string,id,anon
    • Customer_field_1=numeric
    • Customer_field_2=string,anon
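
Putting the reserved Customer Profile sections together shows how a dataset-<name> section and its matching schema-<name> section pair up by name. The following sketch simply restates the reserved-section lists above in configuration form; the angle-bracket placeholders are assumptions, while data-type=customers and the schema-customers values are taken from those lists:

[dataset-customers]

  • csv-separator=<separator character used in the CSV file>
  • data-type=customers
  • join-keys=<columns used to join the Customer Profile data>
  • location=<location of the CSV file containing the Customer Profile data>
  • upload-dataset=<whether Data Loader uploads the Customer Profile>

[schema-customers]

  • Customer_ANI=string,id,anon
  • Customer_field_1=numeric
  • Customer_field_2=string,anon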