Custom LLM Model in GenAI
In the Agent AI widget, the Disposition Summary (end-of-conversation summary) is generated automatically when a conversation ends, using a GenAI model. You can select either the out-of-the-box (OOTB) Kore Summarization model or the Custom Dialog Task model. By default, the OOTB Kore Summarization option uses Kore's GenAI model, but you can configure your own model by creating a Custom LLM. This document describes how to create and use a Custom LLM.
Steps to Create a Custom LLM¶
- Sign in to AI for Service.
- Go to Agent AI, and click Generative AI Tools from the left navigation.
- Click Models Library > New Model, and select Custom Integration from the dropdown list.
- Enter the following inputs:
- Integration Name: A name that identifies the provider or group of LLM models you want to integrate with.
- Model Name: Enter the name of an LLM that is accessed through this integration. You can add multiple models that share the same endpoint.
- Endpoint (POST): Enter the URL of the API endpoint used to interact with the LLM through API requests.
- Click the Headers option, and enter values in the Key and Value fields:
- Key: Enter a key name; for example, api-key.
- Value: Enter a value for the key. For example, 3d4f1cOdxxxxxxxxxxxxxxx.
- Select the LLM and Generative Policy guidelines checkbox.
- Enter a JSON request body (it can include the model, prompt, and any other relevant parameters), and click Test.
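As a rough sketch, the headers and JSON request body configured in the steps above correspond to an HTTP POST against your custom endpoint. The endpoint URL, api-key value, and OpenAI-style body below are hypothetical placeholders, not values Kore prescribes; substitute whatever your provider expects.

```python
import json

# Hypothetical values; replace with your provider's endpoint and key.
ENDPOINT = "https://api.example.com/v1/chat/completions"  # Endpoint (POST)
HEADERS = {
    "api-key": "3d4f1cOdxxxxxxxxxxxxxxx",  # Key/Value pair from the Headers option
    "Content-Type": "application/json",
}

def build_request_body(model: str, prompt: str) -> str:
    """Build a JSON request body containing the model, prompt, and parameters."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return json.dumps(body)

# Print the body you would paste into the JSON request body field.
print(build_request_body("gpt-4o", "Summarize the conversation transcript."))
```

Clicking Test sends this body, with the configured headers, to the endpoint and displays the raw response.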
- On the Conversation Summary page:
- Name: Enter a name. For example, Summary Prompt.
- Feature: Select Conversation Summary from the dropdown list.
- Choose Model: Select the custom model you created from the dropdown list. For example, Custom test mode - gpt-4o.
- Click Test.
- Enter the JSON path that locates the generated text within the model's response in the Text Response Path field. For example, choices[0].message.content.
- Click Lookup Path. The summary is generated.
- Click Save.
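To illustrate how a Text Response Path such as choices[0].message.content is resolved, the sketch below walks that path through a sample OpenAI-style response. Both the response shape and the `lookup_path` helper are assumptions for illustration, not Kore's actual implementation.

```python
import re

def lookup_path(response: dict, path: str):
    """Resolve a dotted/indexed path like 'choices[0].message.content'
    against a parsed JSON response."""
    value = response
    # Split the path into name tokens with an optional [index] suffix.
    for name, index in re.findall(r"(\w+)(?:\[(\d+)\])?", path):
        value = value[name]
        if index != "":
            value = value[int(index)]
    return value

# Sample OpenAI-style response (hypothetical).
sample = {
    "choices": [
        {"message": {"role": "assistant",
                     "content": "Customer asked about a refund; agent issued it."}}
    ]
}

print(lookup_path(sample, "choices[0].message.content"))
# → Customer asked about a refund; agent issued it.
```

If your provider nests the generated text differently, adjust the path accordingly (for example, a completions-style API might use choices[0].text).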