Flow Trigger Endpoints
Use the /run and /webhook endpoints to run flows.
To create, read, update, and delete flows, see Flow management endpoints.
Run flow
Points to note
Robility Flow automatically generates Python, JavaScript, and curl code snippets for the /v1/run/$FLOW_ID endpoint for all flows. For more information, see Generate API code snippets.
Execute a specified flow by ID or name. Flow IDs are shown in the code snippets in the API access pane and in a flow's URL.
The following example runs the Basic Prompting template flow with flow parameters passed in the request body. This flow requires a chat input string (input_value) and uses default values for all other parameters.
curl -X POST \
  "$ROBILITY_FLOW_SERVER_URL/api/v1/run/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $ROBILITY_FLOW_API_KEY" \
  -d '{
    "input_value": "Tell me about something interesting!",
    "session_id": "chat-123",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "",
    "tweaks": null
  }'
The response from /v1/run/$FLOW_ID includes metadata, inputs, and outputs for the run.
If you are parsing the response in an application, you most likely need to extract the relevant content from the response rather than pass the entire response back to the user. For an example of a script that extracts data from a Robility Flow API response, see the Quickstart.
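As a minimal sketch of that extraction step, the following helper pulls the chat message text out of a run response. The nested key path used here is an assumption for illustration; inspect your own server's JSON response to confirm the exact structure.

```python
def extract_message(response_json: dict) -> str:
    """Return the first chat output's message text, or an empty string.

    Assumes a nested response shape like
    {"outputs": [{"outputs": [{"results": {"message": {"text": ...}}}]}]};
    verify against your actual /v1/run response.
    """
    try:
        first_output = response_json["outputs"][0]["outputs"][0]
        return first_output["results"]["message"]["text"]
    except (KeyError, IndexError):
        return ""

# Sample response, abbreviated to the assumed keys only.
sample = {
    "outputs": [
        {"outputs": [
            {"results": {"message": {"text": "Here is something interesting..."}}}
        ]}
    ]
}
```

With this shape, `extract_message(sample)` returns the message text, and a malformed or empty response falls back to an empty string instead of raising.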
Stream LLM token responses
The /v1/run/$FLOW_ID endpoint executes the flow as a batch, with optional streaming of LLM token responses.
To stream LLM token responses, append the ?stream=true query parameter to the request:
curl -X POST \
  "$ROBILITY_FLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $ROBILITY_FLOW_API_KEY" \
  -d '{
    "input_value": "Tell me something interesting!",
    "session_id": "chat-123"
  }'
LLM chat responses are streamed back as token events, culminating in a final end event that closes the connection.
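A client consuming that stream can accumulate token events until the end event arrives. The sketch below assumes newline-delimited JSON events with `event` and `data` fields; the exact event shape is an assumption, so check the stream your server actually emits.

```python
import json

def collect_tokens(lines) -> str:
    """Concatenate streamed token chunks until the final end event.

    Assumes each line is a JSON object like
    {"event": "token", "data": {"chunk": "..."}} and that the stream
    closes with {"event": "end", ...}; adjust to your server's format.
    """
    text = []
    for line in lines:
        if not line.strip():
            continue
        event = json.loads(line)
        if event.get("event") == "token":
            text.append(event["data"]["chunk"])
        elif event.get("event") == "end":
            break  # final event closes the connection
    return "".join(text)

# Example stream, using the assumed event shape.
stream = [
    '{"event": "token", "data": {"chunk": "Hello"}}',
    '{"event": "token", "data": {"chunk": ", world"}}',
    '{"event": "end", "data": {}}',
]
```

Here `collect_tokens(stream)` yields the full message `"Hello, world"`; in a real client you would iterate over the HTTP response body line by line instead of a list.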
Run endpoint headers
| Header | Info | Example |
|---|---|---|
| Content-Type | Required. Specifies the JSON format. | "application/json" |
| accept | Optional. Specifies the response format. | "application/json" |
| x-api-key | Optional. Required only if authentication is enabled. | "sk-..." |
Run endpoint parameters
| Parameter | Type | Info |
|---|---|---|
| flow_id | UUID/string | Required. Part of URL: /run/$FLOW_ID |
| stream | Boolean | Optional. Query parameter: /run/$FLOW_ID?stream=true |
| input_value | string | Optional. JSON body field. Main input text/prompt. Default: null |
| input_type | string | Optional. JSON body field. Input type ("chat" or "text"). Default: "chat" |
| output_type | string | Optional. JSON body field. Output type ("chat", "any", "debug"). Default: "chat" |
| output_component | string | Optional. JSON body field. Target component for output. Default: "" |
| tweaks | object | Optional. JSON body field. Component adjustments. Default: null |
| session_id | string | Optional. JSON body field. Conversation context ID. See Session ID. Default: null |
Request example with all headers and parameters
curl -X POST \
  "$ROBILITY_FLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
  -H "Content-Type: application/json" \
  -H "accept: application/json" \
  -H "x-api-key: $ROBILITY_FLOW_API_KEY" \
  -d '{
    "input_value": "Tell me a story",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "chat_output",
    "session_id": "chat-123",
    "tweaks": {
      "component_id": {
        "parameter_name": "value"
      }
    }
  }'
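If you build the request body in code, a small helper keeps the defaults from the parameters table in one place. This is a sketch; the `component_id` and `parameter_name` keys in the tweaks example are placeholders, just as in the curl example above.

```python
def build_run_payload(input_value, session_id=None, tweaks=None,
                      input_type="chat", output_type="chat",
                      output_component=""):
    """Build the JSON body for /v1/run/$FLOW_ID.

    Defaults mirror the run endpoint parameters table:
    input_type/output_type default to "chat", output_component to "",
    and session_id/tweaks to null.
    """
    return {
        "input_value": input_value,
        "input_type": input_type,
        "output_type": output_type,
        "output_component": output_component,
        "session_id": session_id,
        "tweaks": tweaks,
    }

# Placeholder component and parameter names, as in the example above.
payload = build_run_payload(
    "Tell me a story",
    session_id="chat-123",
    tweaks={"component_id": {"parameter_name": "value"}},
)
```

Serializing `payload` with `json.dumps` produces the same body as the `-d` argument in the curl example.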
Webhook run flow
Use the /webhook endpoint to start a flow by sending an HTTP POST request.
Points to note
After you add a Webhook component to a flow, open the API access pane, and then click the Webhook curl tab to get an automatically generated POST /webhook request for your flow. For more information, see Trigger flows with webhooks.
curl -X POST \
  "$ROBILITY_FLOW_SERVER_URL/api/v1/webhook/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $ROBILITY_FLOW_API_KEY" \
  -d '{"data": "example-data"}'
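The same webhook trigger can be sketched in Python with only the standard library. The environment variable names mirror the shell variables above, the localhost fallback URL is an assumption, and the payload shape is whatever your flow's Webhook component expects.

```python
import json
import os
import urllib.request

def build_webhook_request(flow_id: str, data: dict) -> urllib.request.Request:
    """Build a POST request for /api/v1/webhook/$FLOW_ID.

    Reads ROBILITY_FLOW_SERVER_URL and ROBILITY_FLOW_API_KEY from the
    environment; the localhost:7860 fallback is an assumed default.
    """
    base = os.environ.get("ROBILITY_FLOW_SERVER_URL", "http://localhost:7860")
    return urllib.request.Request(
        url=f"{base}/api/v1/webhook/{flow_id}",
        data=json.dumps(data).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "x-api-key": os.environ.get("ROBILITY_FLOW_API_KEY", ""),
        },
        method="POST",
    )

# "my-flow-id" is a placeholder flow ID.
req = build_webhook_request("my-flow-id", {"data": "example-data"})
# Send with: urllib.request.urlopen(req)
```

Building the `Request` object separately from sending it makes the call easy to inspect or test without hitting the server.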
Deprecated flow trigger endpoints
The following endpoints are deprecated and replaced by the /run endpoint:
- /process
- /predict