LLM Integration

Sisense Intelligence uses Large Language Models (LLMs) as the basis for its advanced analytical features. Sisense offers both managed LLM integration and bring-your-own-LLM (BYO-LLM) options:

  • Managed LLM: For Generative AI features, Sisense can provide a managed Azure-hosted LLM. Currently, this is available only for Narrative; managed LLM support will be extended to all Generative AI features in the future.

  • BYO-LLM: Customers supply their own LLM (OpenAI or Azure OpenAI) for all features except Narrative, which requires a separate license. The LLM API keys are configured directly by the customer, as sketched below.
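
As a rough sketch of what BYO-LLM key setup can look like in practice, the Python snippet below registers a customer-supplied key with a Sisense deployment. The host URL, endpoint path, payload fields, and model names are illustrative assumptions only, not the documented Sisense REST API; refer to the API reference for your Sisense version.

    import os
    import requests

    # Assumptions for illustration: the endpoint path and payload fields below are
    # NOT the documented Sisense API -- check the REST reference for your version.
    SISENSE_URL = "https://your-sisense-host.example.com"
    SISENSE_TOKEN = os.environ["SISENSE_API_TOKEN"]   # Sisense REST bearer token
    LLM_API_KEY = os.environ["OPENAI_API_KEY"]        # customer-supplied OpenAI/Azure key

    payload = {
        "provider": "azure-openai",                           # or "openai"
        "apiKey": LLM_API_KEY,
        "endpoint": "https://my-resource.openai.azure.com",   # Azure OpenAI resource URL
        "deployment": "gpt-4o",                               # Azure deployment / model name
    }

    response = requests.post(
        f"{SISENSE_URL}/api/v1/settings/ai/llm",   # hypothetical endpoint for illustration
        json=payload,
        headers={"Authorization": f"Bearer {SISENSE_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    print("LLM provider registered:", response.status_code)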

Data Sent to the LLM

  • User prompts submitted through the assistant

  • Datasource semantic metadata (e.g., field names or descriptions)

  • Aggregated query result set

  • Sample column values & statistics (e.g., uniqueness, row count and range)
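
To make these categories concrete, the snippet below sketches one possible shape of the context that could accompany a single user prompt. The field names, values, and structure are illustrative assumptions, not Sisense's actual request format.

    # Hypothetical request context combining the four data types above.
    # Field names and structure are illustrative assumptions, not Sisense's wire format.
    llm_request_context = {
        "user_prompt": "What were total sales by region last quarter?",
        "semantic_metadata": [
            {"table": "Sales", "field": "Revenue", "description": "Net revenue in USD"},
            {"table": "Sales", "field": "Region", "description": "Customer sales region"},
        ],
        "aggregated_result_set": [
            {"Region": "EMEA", "Revenue": 1240000},
            {"Region": "APAC", "Revenue": 980000},
        ],
        "column_statistics": {
            "Region": {"unique_values": 4, "row_count": 52000},
            "Revenue": {"row_count": 52000, "range": [12.5, 98400.0]},
        },
    }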

Which of these data types is sent to the LLM, and whether any training on your data occurs, varies by feature:

  • Explanations

  • Forecast

  • Trend

  • Exploration Paths

  • Simply Ask (Legacy NLQ)

  • Assistant

  • Narrative

  • Semantic Enrichment

  • Smart Value Matching