Generative AI for Self-Hosted Environments - BETA
Sisense Intelligence empowers analytics through advanced Generative AI capabilities. While fully integrated into Sisense Managed Cloud environments, self-hosted customers can now leverage Generative AI through an independently managed infrastructure. This enables on-premises or self-hosted environments to maintain complete control while benefiting from powerful AI-driven analytics.
Note:

- Self-hosted Generative AI capabilities are currently in beta. Participating customers must remain agile, adapting to periodic updates and expansions to Sisense's Generative AI architecture. For additional support, contact your Sisense Customer Success Manager or Account Representative.
- This topic is specifically for setting up self-hosted Generative AI. For more background and general information about Generative AI, see Generative AI - Empowering Your Analytics Experience.
Uniqueness of Self-Hosted Environments
In self-hosted scenarios, customers directly manage their own infrastructure, including the provisioning and maintenance of both the Vector Database (MongoDB Atlas VDB) and Large Language Model (LLM) services through the Azure OpenAI or OpenAI APIs. This setup provides enhanced data sovereignty and deeper integration possibilities within customers' own IT ecosystems.
Prerequisites
For the successful use of AI capabilities in a self-hosted environment, ensure the following:
- Sisense version L2025.2 or higher, regularly updated to align with evolving beta features
- A MongoDB Atlas cluster (minimum recommended size: M10)
- Public network access, with open web access through an IP allowlist
- A public IP defined and accessible for your Sisense instance
- Monitoring and management of the deployment in line with Sisense guidance
- Acceptance that additional services may be required as the architecture matures
Getting Started
Before enabling Generative AI, ensure you have access to your LLM key and Vector DB connection string.
- For detailed instructions on setting up your LLM, including supported models, see Setting Up Your LLM.
- For detailed instructions on setting up your VDB, see Setting Up Your Vector Database (VDB).
Enabling GenAI
A Sisense Administrator can enable (or disable) GenAI as follows:

- Search for "Sisense Intelligence" in the search bar, or open the App Configuration drop-down.
- Click Sisense Intelligence.
- Enable Generative AI via the toggle.
When this toggle is disabled, you will not be able to access the GenAI features that require your own LLM (Narrative is controlled separately).
Connecting Your LLM
To enable Generative AI features, Sisense requires integration with a supported Large Language Model (LLM). This section outlines how to connect your LLM.
- Accept the Terms of Use - After enabling AI on your system, you must consent to the general AI terms and conditions. To proceed, you must configure your own LLM API key.
- Choose your LLM Provider - Open the Provider drop-down list and select your preferred provider.
- Configure your LLM Connection - Once you have selected your provider, enter the configuration details, including your LLM API key, to complete the setup.
- Azure OpenAI:

| Name | Description | Example |
|---|---|---|
| Model Name | Model deployment name you chose during the deployment process on your LLM platform. | my-gpt-4o-mini |
| Base URL | Endpoint URL of your LLM instance. | https://MyLLM.openai.azure.com/ |
| API Key | API key for authenticating and authorizing your requests to the model's API. | <your_api_key> |
- OpenAI:

| Name | Description | Example |
|---|---|---|
| Model Name | Model name corresponding to your OpenAI API key (see the OpenAI supported models list). This should be the exact model identifier used for API requests. Note: Ensure that the key permissions are set to All (not Restricted/ReadOnly). See Setting Up Your LLM for more information. | gpt-4o-mini-2024-07-18 |
| API Key | API key for authenticating and authorizing your requests to the model's API. | <your_api_key> |
- Test and Save - Click Test to validate your configuration. Once successful, click Save.
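Before entering these values in Sisense, you may want to sanity-check the credentials directly against the provider. The sketch below builds (but does not send) a request to the standard Azure OpenAI chat-completions REST endpoint; the base URL, deployment name, and API version are placeholders from the table above and should be replaced with your own values, and the API version shown is an assumption you should verify against your Azure deployment:

```python
import json
import urllib.request

# Placeholder values - substitute your own deployment details.
BASE_URL = "https://MyLLM.openai.azure.com"   # Base URL (no trailing slash)
DEPLOYMENT = "my-gpt-4o-mini"                 # Model (deployment) name
API_KEY = "<your_api_key>"
API_VERSION = "2024-06-01"                    # Assumed API version; check your Azure setup

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build (without sending) a chat-completions request for the deployment."""
    url = (f"{BASE_URL}/openai/deployments/{DEPLOYMENT}"
           f"/chat/completions?api-version={API_VERSION}")
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("ping")
print(req.get_method(), req.full_url)
```

Once real credentials are in place, the request can be sent with `urllib.request.urlopen(req)`; a 200 response confirms the key, base URL, and deployment name are consistent.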
Connecting your Vector Database
To set up your vector database connection:
- Navigate to the Sisense Intelligence settings under Admin > App Configuration.
- Under the Vector Database (Beta) section, paste the connection string for your supported vector database into the Vector Database Connection String field.
- Click Test to validate the connection.
- Once the test is successful, click Save to apply your configuration.
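For MongoDB Atlas, the connection string typically uses the `mongodb+srv` scheme; the user, password, and cluster host below are placeholders to be replaced with your own Atlas values:

```
mongodb+srv://<db_user>:<db_password>@<cluster-name>.mongodb.net/?retryWrites=true&w=majority
```

The string can be copied from the Atlas UI via Connect > Drivers for your cluster.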
API Reference
To set up your LLM connection via REST API:
- Under Admin, navigate to REST API and select version 2.0.
- Use POST /settings/ai/llm/ to add or update the AI settings configuration.
- Use POST /ai/llm/test to test the connection to your LLM deployment.
To set up your Vector DB connection via REST API:
- Under Admin, navigate to REST API and select version 2.0.
- Use POST /settings/ai/vdb/ to add or update the VDB connection string.
- Use POST /ai/vdb/test to test the connection to your VDB.
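The four calls above can be sketched together. This is a minimal sketch assuming the usual `/api/v2` prefix and bearer-token authentication for the Sisense REST API; the instance URL, token, and payload field names are illustrative placeholders, not the documented schema (only the endpoint paths come from this page):

```python
import json
import urllib.request

SISENSE_URL = "https://your-sisense.example.com"  # Placeholder instance URL
API_TOKEN = "<your_api_token>"                    # Placeholder admin API token

def build_post(path: str, payload: dict) -> urllib.request.Request:
    """Build a POST request against the v2 REST API (sending is left to the caller)."""
    return urllib.request.Request(
        f"{SISENSE_URL}/api/v2{path}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Endpoint paths are from this page; the payload keys are illustrative guesses.
requests_to_send = [
    build_post("/settings/ai/llm/", {"provider": "openai",
                                     "modelName": "gpt-4o-mini-2024-07-18",
                                     "apiKey": "<your_llm_api_key>"}),
    build_post("/ai/llm/test", {}),
    build_post("/settings/ai/vdb/", {"connectionString": "<your_vdb_connection_string>"}),
    build_post("/ai/vdb/test", {}),
]
for req in requests_to_send:
    print(req.get_method(), req.full_url)
```

To actually issue a call, pass the request to `urllib.request.urlopen(req)` and check the response status; the interactive REST API page under Admin shows the authoritative request schema for each endpoint.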