Setting Up Your LLM

To enable Sisense Generative AI features using Large Language Models (LLMs), you must first configure and deploy an LLM provider, such as OpenAI or Azure OpenAI Service. This process involves creating and deploying a resource, selecting a supported model and region, and setting up access credentials. Once the resource is deployed, you must configure Sisense with the correct provider settings, including the model name, API key, and endpoint URL. Additionally, if you are using OpenAI, you must ensure your API key has the necessary permissions. Sisense currently supports several versions of GPT models, and it is your responsibility to ensure the correct version is configured for optimal compatibility.

Creating a Resource

Creating and Deploying an Azure OpenAI Service Resource

  1. In the Azure portal, begin creating an Azure OpenAI Service resource. From the Region drop-down, select a region that supports the model version you want to use.

  2. In the Name field, type an instance name. The instance name will be part of the endpoint name (Base URL).

    For more information, see Create and Deploy an Azure resource and Azure OpenAI Service Models.

  3. Follow the Azure documentation to deploy a base model on the resource you created. For more information, see Azure OpenAI Service Models.

  4. In the Sisense LLM Configuration, set the following (a quick way to verify these values outside Sisense is sketched after this list):

    • Provider - Azure OpenAI

    • Model Name - The model name that you set in Azure

    • Base URL - The endpoint URL of the resource you deployed

    • API Key - The API key of your Azure OpenAI resource
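
Before entering these values in Sisense, it can help to confirm that the deployment responds on its own. The following is a minimal sketch, assuming the openai Python package (v1 or later); the endpoint, deployment name, API version, and key are placeholders to replace with your own values.

  # A minimal connectivity check for an Azure OpenAI deployment.
  # All values in angle brackets are placeholders; replace them with your own.
  from openai import AzureOpenAI

  client = AzureOpenAI(
      azure_endpoint="https://<your-instance-name>.openai.azure.com",  # the Base URL of your resource
      api_key="<your-azure-openai-api-key>",
      api_version="2024-02-01",  # example API version; use one your deployment supports
  )

  # "model" is the deployment name you created in Azure, not the base model family name.
  response = client.chat.completions.create(
      model="<your-deployment-name>",
      messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
  )

  print(response.choices[0].message.content)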

Creating and Setting OpenAI API Key and Permissions

  1. Create an API key to access the OpenAI API.

  2. Sisense requires Write access for your Model capabilities. Therefore, when creating or editing your secret key, set the Permissions to either All or Restricted. If you set the Permissions to "Read only", your Sisense GenAI features will not work.

    If you set the Permissions to "Restricted", you must also set the Permissions for "Model capabilities" to Write.

  3. Save your key.

  4. In the Sisense LLM Configuration, set the following (a quick way to confirm the key works is sketched after this list):

    • Provider - OpenAI

    • Model Name - The OpenAI model name

    • API Key - The API key that you created
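
If you want to confirm the key has the required Write access before configuring Sisense, here is a minimal sketch, assuming the openai Python package (v1 or later); the key and model name are placeholders. A key restricted to Read only for Model capabilities should fail on this call with a permissions error.

  # A minimal check that an OpenAI API key can call the chat completions endpoint.
  # The key and model name are placeholders.
  from openai import OpenAI

  client = OpenAI(api_key="<your-openai-api-key>")

  response = client.chat.completions.create(
      model="gpt-4o-mini",  # the model name you plan to configure in Sisense
      messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
  )

  print(response.choices[0].message.content)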

Supported LLM Providers and Model Versions

Sisense currently supports OpenAI foundation models hosted either directly through OpenAI or through the Azure OpenAI Service.

From time to time, additional supported model versions may be added after quality verification.

It is your responsibility to manage your model version.

Model          Version
GPT-4o-mini    gpt-4o-mini-0718
GPT-4o         gpt-4o-0513
GPT-3.5        gpt-35-turbo-0125
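
To help manage your model version, the following minimal sketch lists the GPT model identifiers available to an OpenAI API key; it assumes the openai Python package (v1 or later) and uses a placeholder key. For Azure OpenAI, available versions are instead managed through the deployments you create in Azure.

  # List the GPT model identifiers visible to an OpenAI API key.
  # The key is a placeholder; the output helps confirm which versions you can configure.
  from openai import OpenAI

  client = OpenAI(api_key="<your-openai-api-key>")

  for model in client.models.list():
      if model.id.startswith("gpt-"):
          print(model.id)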