LLM Configuration Management

Estimated reading: 2 minutes

LLM Configuration Management provides a centralized interface for managing Large Language Model (LLM) providers, which is essential for AI-driven automation workflows. It securely stores and encrypts sensitive credentials, ensuring that access is restricted to authorized users only.

By default, configurations are stored in the Robility Manager. Tenant administrators have permission to add and modify LLM provider information, such as Azure OpenAI and Google Vertex AI, including Azure OpenAI embedding and runtime configurations. Once added, the configuration details can be used through a custom LLM component in Robility workflows, which retrieves them automatically via API, eliminating the need to hardcode provider details in individual workflows.
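As an illustration, a workflow component might retrieve such a configuration payload and parse it before use. This is a minimal sketch only; the JSON field names and the `parse_llm_config` helper are assumptions for illustration, not the actual Robility API:

```python
import json
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Hypothetical shape of a centrally stored LLM provider configuration."""
    provider: str
    endpoint: str
    deployment: str
    api_version: str


def parse_llm_config(payload: str) -> LLMConfig:
    # Parse a JSON payload as a hypothetical configuration API might return it.
    data = json.loads(payload)
    return LLMConfig(
        provider=data["provider"],
        endpoint=data["endpoint"],
        deployment=data["deployment"],
        api_version=data["api_version"],
    )


sample = (
    '{"provider": "azure_openai", '
    '"endpoint": "https://example.openai.azure.com", '
    '"deployment": "gpt-4o", '
    '"api_version": "2024-02-01"}'
)
cfg = parse_llm_config(sample)
```

Because the workflow reads the configuration at run time rather than embedding it, rotating an API key or switching deployments in the manager takes effect without editing any workflow.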

Purpose
1. Centralize the configuration of LLM endpoints, API keys, and deployments.
2. Securely store and manage authentication credentials in an encrypted manner.
3. Enable seamless integration of AI models into workflows.
4. Ensure that updates to provider configurations are applied instantly across all dependent workflows.

Configuration

Currently, the following providers are supported; additional providers will be added in the future. To configure LLMs for your workflows, connect to one of these AI providers:

- Azure OpenAI
- Google Vertex AI
- Azure OpenAI Embeddings

This allows administrators to define endpoints, API keys, and deployment details for seamless AI integration into automation workflows.

LLM Providers

To add LLM providers, configure the following parameters:

| Provider | Parameter | Description |
| --- | --- | --- |
| Azure OpenAI | Azure Endpoint* | Specify the URL endpoint for your Azure OpenAI resource. |
| | Model / Deployment* | Specify the name of the model deployment. |
| | Version* | Specify the version of the model. |
| | API Key* | Provide the secret key generated in Azure to authenticate requests. |
| Google Vertex AI | Model Name* | Specify the name of your deployed Vertex AI model. |
| | Service Account JSON* | Upload the JSON file containing credentials for secure access to Vertex AI resources. |

*Represents mandatory fields.
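The mandatory fields above lend themselves to a simple pre-flight check before a workflow runs. A minimal sketch in Python, assuming the configuration reaches the workflow as a plain dictionary (the key names are illustrative, not the actual field names used by Robility):

```python
def missing_azure_openai_fields(cfg: dict) -> list:
    """Return the mandatory Azure OpenAI fields (marked * above) that are absent or empty."""
    required = ["azure_endpoint", "deployment", "api_version", "api_key"]
    return [field for field in required if not cfg.get(field)]


# A partial configuration: deployment and api_version are still missing.
partial = {"azure_endpoint": "https://example.openai.azure.com", "api_key": "secret"}
missing = missing_azure_openai_fields(partial)
```

Failing fast on missing mandatory fields surfaces configuration mistakes in the manager rather than as opaque authentication or routing errors at call time.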

Embedding & Runtime Configuration

For embedding details, configure the following parameters for Azure OpenAI Embeddings:

| Parameter | Description |
| --- | --- |
| Azure Endpoint* | Specify the URL endpoint for your Azure OpenAI embedding resource. |
| Embedding Model* | Specify the name of the model used to generate embeddings. |
| Deployment Name* | Specify the name of the model deployment for API access. |
| API Version* | Specify the API version to use for embedding requests. |
| API Key* | Provide the secret key for authentication. |

*Represents mandatory fields.
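For reference, these parameters map directly onto the Azure OpenAI embeddings REST endpoint. A sketch of how the request URL is assembled from them (the endpoint, deployment name, and API version values below are placeholders):

```python
def azure_embeddings_url(endpoint: str, deployment: str, api_version: str) -> str:
    # Azure OpenAI embeddings REST path: {endpoint}/openai/deployments/{deployment}/embeddings
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/embeddings?api-version={api_version}"
    )


url = azure_embeddings_url(
    "https://example.openai.azure.com/",  # Azure Endpoint (placeholder)
    "text-embedding-deployment",          # Deployment Name (placeholder)
    "2024-02-01",                         # API Version (placeholder)
)
```

The API key from the table is sent separately, in the `api-key` request header, which is why it is stored encrypted rather than embedded in the URL.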
