How to use a Custom LLM provider for GenAI Copilot

 

By default, the DataClarity Unlimited Analytics platform uses OpenAI's API to power:

  • Copilot ask any question – translate natural language into charts and dashboards (Chat with Your Data)

  • Copilot explain data – generate insights using customizable GenAI templates

You can override the default configuration to use any provider that is OpenAI API–compatible, including Azure OpenAI or self-hosted alternatives.
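
For example, if you host a server that exposes the OpenAI chat completions API (such as vLLM or Ollama with OpenAI compatibility enabled), switching providers typically only requires changing the endpoint URL and the model name. The host name and model below are placeholders, not values shipped with the platform; the configuration variables themselves are described in the next section:

core.open-ai.url=http://llm-gateway.internal:8000
core.open-ai.model=llama-3.1-8b-instruct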

 


Default Configuration

Set the following configuration variables in the Storyboards application:

Default values:

core.open-ai.url=https://api.openai.com
core.open-ai.model=gpt-3.5-turbo
core.open-ai.max-tokens=16385

 

  • core.open-ai.url - Base URL of the OpenAI-compatible endpoint
  • core.open-ai.model - Default model Copilot uses when translating natural language into data visualizations
  • core.open-ai.max-tokens - Maximum number of tokens per request. For model-specific limits, refer to your LLM provider's documentation.
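
For example, if the model you configure supports a smaller context window than gpt-3.5-turbo, lower the token limit to match. The model name and limit below are illustrative only; use the figures published by your provider:

core.open-ai.model=mistral-7b-instruct
core.open-ai.max-tokens=8192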

 

Important: DataClarity has been tested most extensively with gpt-3.5-turbo. If you switch to a different model, validate that chart generation and accuracy meet your expectations.
