LLM Configuration Management


LLM Configuration Management in Robility provides a centralized governance framework for configuring, managing, and controlling access to LLM providers, so that users and automations across the tenant can consume the configured language models through Robility Manager.

Tenant administrators have exclusive permissions to manage and modify this configuration, including the ability to integrate external language model providers such as Azure OpenAI and Google Vertex AI. The configured details can be utilized through the Robility LLM component within a workflow, which retrieves the configuration dynamically at runtime, eliminating the need to hardcode provider settings or credentials.

Purpose

1. Centralize the configuration of LLM endpoints, API keys, and deployments.
2. Securely store and manage authentication credentials in an encrypted manner. All credentials are stored in the vault configured in the vault settings section, enabling seamless injection into automated workflows without hardcoding sensitive data into scripts or configuration files.
3. Enable seamless integration of AI models into workflows.
4. Ensure that updates to provider configurations are applied instantly across all dependent workflows.
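The ideas above can be sketched conceptually. In the snippet below, `get_llm_config`, the configuration name, and the `vault://` reference are hypothetical stand-ins for however Robility Manager exposes the tenant configuration; they are not a real Robility API.

```python
# Illustrative sketch: an in-memory registry stands in for the
# tenant-level LLM configuration. Workflows look up settings by name
# at runtime instead of hardcoding endpoints or keys in scripts.
_LLM_CONFIGS = {
    "azure-openai-default": {
        "endpoint": "https://my-resource.openai.azure.com",
        "deployment": "gpt-4o-deployment",
        "api_version": "2024-02-01",
        # In Robility, the secret itself lives in the vault configured
        # under vault settings; only a reference appears here.
        "api_key_ref": "vault://llm/azure-openai/api-key",
    },
}

def get_llm_config(name: str) -> dict:
    """Return the named provider configuration (hypothetical helper)."""
    return _LLM_CONFIGS[name]

# A workflow resolves the configuration at runtime, so an update to the
# registry entry takes effect for every dependent workflow immediately.
config = get_llm_config("azure-openai-default")
```

Because the workflow only holds the configuration name, rotating a key or switching deployments is a change made once, centrally, rather than in every script.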

Configuration

Robility currently supports the following providers; additional providers will be added in the future.

To configure LLMs for your workflows, connect to one of these AI providers:
     • Azure OpenAI
     • Google Vertex AI
     • Azure OpenAI Embeddings

This allows administrators to define endpoints, API keys, and deployment details for seamless AI integration into automation workflows.

LLM Providers

To add LLM providers, configure the following parameters:

Provider           Parameter               Description
Azure OpenAI       Azure Endpoint*         Specify the URL endpoint for your Azure OpenAI resource.
                   Model / Deployment*     Specify the name of the deployment model.
                   Version*                Specify the version of the model.
                   API Key*                Provide the secret key generated in Azure to authenticate requests.
Google Vertex AI   Model Name*             Specify the name of your deployed Vertex AI model.
                   Service Account JSON*   Upload the JSON file with credentials for secure access to Vertex AI resources.

*Represents mandatory fields.
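To illustrate how the Azure OpenAI parameters fit together at request time: Azure routes calls by deployment name and API version in the URL, with the key sent in an `api-key` header. The endpoint, deployment, version, and key values below are hypothetical placeholders for your configured values.

```python
import json
import urllib.request

# Placeholder values -- substitute the ones saved in your Robility
# LLM provider configuration.
AZURE_ENDPOINT = "https://my-resource.openai.azure.com"  # Azure Endpoint*
DEPLOYMENT = "gpt-4o-deployment"                         # Model / Deployment*
API_VERSION = "2024-02-01"                               # Version*
API_KEY = "<secret-from-vault>"                          # API Key*

# Azure OpenAI addresses a model by deployment name and API version:
url = (f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
       f"/chat/completions?api-version={API_VERSION}")

request = urllib.request.Request(
    url,
    data=json.dumps({"messages": [{"role": "user", "content": "Hello"}]}).encode(),
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request) would send the call; it is omitted
# here because it requires a live Azure OpenAI resource.
```

In a Robility workflow the Robility LLM component assembles this request for you from the stored configuration; the sketch only shows what each mandatory field contributes.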

Embedding & Runtime Configuration

For embedding details, configure the following parameters for Azure OpenAI Embeddings:

Parameter           Description
Azure Endpoint*     Specify the URL endpoint for your Azure OpenAI embedding resource.
Embedding Model*    Specify the name of the model used to generate embeddings.
Deployment Name*    Specify the name of the model deployment for API access.
API Version*        Specify the version of the model used to generate embeddings.
API Key*            Provide the secret key for authentication.

*Represents mandatory fields.
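The embeddings parameters map onto the request URL the same way: Azure OpenAI exposes an `/embeddings` route per deployment, versioned via the `api-version` query string. The endpoint, deployment, and version values below are hypothetical placeholders.

```python
# Placeholder values matching the mandatory fields above.
AZURE_ENDPOINT = "https://my-resource.openai.azure.com"   # Azure Endpoint*
DEPLOYMENT_NAME = "text-embedding-3-small-deploy"         # Deployment Name*
API_VERSION = "2023-05-15"                                # API Version*

# Embedding calls target the deployment's /embeddings route; the body
# of the POST would carry {"input": "...text to embed..."}.
embeddings_url = (f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT_NAME}"
                  f"/embeddings?api-version={API_VERSION}")
print(embeddings_url)
```

The Embedding Model field identifies which model backs the deployment; at the API level, requests are addressed by the deployment name shown in the URL.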
