# /containers
Manage OpenAI code interpreter containers (sessions) for executing code in isolated environments.
| Feature | Supported |
|---|---|
| Cost Tracking | ✅ |
| Logging | ✅ (Full request/response logging) |
| Load Balancing | ✅ |
| Proxy Server Support | ✅ Full proxy integration with virtual keys |
| Spend Management | ✅ Budget tracking and rate limiting |
| Supported Providers | openai |
## Quick Start
Containers provide isolated execution environments for code interpreter sessions. You can create, list, retrieve, and delete containers.
### SDK, PROXY, and OpenAI Client

- SDK
- LiteLLM PROXY Server
- OpenAI Python Client
#### Create a Container

```python
import litellm
import os

# setup env
os.environ["OPENAI_API_KEY"] = "sk-.."

container = litellm.create_container(
    name="My Code Interpreter Container",
    custom_llm_provider="openai",
    expires_after={
        "anchor": "last_active_at",
        "minutes": 20
    }
)

print(f"Container ID: {container.id}")
print(f"Container Name: {container.name}")

### ASYNC USAGE ###
# container = await litellm.acreate_container(
#     name="My Code Interpreter Container",
#     custom_llm_provider="openai",
#     expires_after={
#         "anchor": "last_active_at",
#         "minutes": 20
#     }
# )
```
#### List Containers

```python
from litellm import list_containers, alist_containers
import os

os.environ["OPENAI_API_KEY"] = "sk-.."

containers = list_containers(
    custom_llm_provider="openai",
    limit=20,
    order="desc"
)

print(f"Found {len(containers.data)} containers")
for container in containers.data:
    print(f"  - {container.id}: {container.name}")

### ASYNC USAGE ###
# containers = await alist_containers(
#     custom_llm_provider="openai",
#     limit=20,
#     order="desc"
# )
```
#### Retrieve a Container

```python
from litellm import retrieve_container, aretrieve_container
import os

os.environ["OPENAI_API_KEY"] = "sk-.."

container = retrieve_container(
    container_id="cntr_123...",
    custom_llm_provider="openai"
)

print(f"Container: {container.name}")
print(f"Status: {container.status}")
print(f"Created: {container.created_at}")

### ASYNC USAGE ###
# container = await aretrieve_container(
#     container_id="cntr_123...",
#     custom_llm_provider="openai"
# )
```
#### Delete a Container

```python
from litellm import delete_container, adelete_container
import os

os.environ["OPENAI_API_KEY"] = "sk-.."

result = delete_container(
    container_id="cntr_123...",
    custom_llm_provider="openai"
)

print(f"Deleted: {result.deleted}")
print(f"Container ID: {result.id}")

### ASYNC USAGE ###
# result = await adelete_container(
#     container_id="cntr_123...",
#     custom_llm_provider="openai"
# )
```
To use containers through the proxy, start the LiteLLM proxy server:

```shell
$ export OPENAI_API_KEY="sk-..."
$ litellm

# RUNNING on http://0.0.0.0:4000
```
### Custom Provider Specification

You can specify the custom LLM provider in multiple ways (priority order):

1. Header: `-H "custom-llm-provider: openai"`
2. Query param: `?custom_llm_provider=openai`
3. Request body: `{"custom_llm_provider": "openai", ...}`
4. Defaults to `"openai"` if not specified
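As a sketch of how that priority order behaves (the function below is illustrative, not LiteLLM's internal API):

```python
# Illustrative resolution of the custom provider, in the priority order
# documented above: header > query param > request body > "openai" default.

def resolve_custom_llm_provider(headers: dict, query_params: dict, body: dict) -> str:
    """Return the provider name using header > query param > body > default."""
    return (
        headers.get("custom-llm-provider")
        or query_params.get("custom_llm_provider")
        or body.get("custom_llm_provider")
        or "openai"
    )

# The header wins even when the request body names a different provider:
print(resolve_custom_llm_provider(
    {"custom-llm-provider": "openai"},
    {},
    {"custom_llm_provider": "azure"},
))  # openai
```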
#### Create a Container

```shell
# Default provider (openai)
curl -X POST "http://localhost:4000/v1/containers" \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Container",
    "expires_after": {
      "anchor": "last_active_at",
      "minutes": 20
    }
  }'

# Via header
curl -X POST "http://localhost:4000/v1/containers" \
  -H "Authorization: Bearer sk-1234" \
  -H "custom-llm-provider: openai" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Container"
  }'

# Via query parameter
curl -X POST "http://localhost:4000/v1/containers?custom_llm_provider=openai" \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Container"
  }'
```
#### List Containers

```shell
curl "http://localhost:4000/v1/containers?limit=20&order=desc" \
  -H "Authorization: Bearer sk-1234"
```

#### Retrieve a Container

```shell
curl "http://localhost:4000/v1/containers/cntr_123..." \
  -H "Authorization: Bearer sk-1234"
```

#### Delete a Container

```shell
curl -X DELETE "http://localhost:4000/v1/containers/cntr_123..." \
  -H "Authorization: Bearer sk-1234"
```
#### Setup

```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-1234",                # Your LiteLLM proxy key
    base_url="http://localhost:4000"  # LiteLLM proxy URL
)
```
#### Create a Container

```python
container = client.containers.create(
    name="test-container",
    expires_after={
        "anchor": "last_active_at",
        "minutes": 20
    },
    extra_body={"custom_llm_provider": "openai"}
)

print(f"Container ID: {container.id}")
print(f"Container Name: {container.name}")
print(f"Created at: {container.created_at}")
```
#### List Containers

```python
containers = client.containers.list(
    limit=20,
    extra_body={"custom_llm_provider": "openai"}
)

print(f"Found {len(containers.data)} containers")
for container in containers.data:
    print(f"  - {container.id}: {container.name}")
```
#### Retrieve a Container

```python
container = client.containers.retrieve(
    container_id="cntr_6901d28b3c8881908b702815828a5bde0380b3408aeae8c7",
    extra_body={"custom_llm_provider": "openai"}
)

print(f"Container: {container.name}")
print(f"Status: {container.status}")
print(f"Last active: {container.last_active_at}")
```
#### Delete a Container

```python
result = client.containers.delete(
    container_id="cntr_6901d28b3c8881908b702815828a5bde0380b3408aeae8c7",
    extra_body={"custom_llm_provider": "openai"}
)

print(f"Deleted: {result.deleted}")
print(f"Container ID: {result.id}")
```
## Container Parameters

### Create Container Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| `name` | string | Yes | Name of the container |
| `expires_after` | object | No | Container expiration settings |
| `expires_after.anchor` | string | No | Anchor point for expiration (e.g., `"last_active_at"`) |
| `expires_after.minutes` | integer | No | Minutes until expiration from anchor |
| `file_ids` | array | No | List of file IDs to include in the container |
| `custom_llm_provider` | string | No | LLM provider to use (default: `"openai"`) |
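To make the `expires_after` semantics concrete, here is a small sketch (assuming the anchor resolves to a Unix timestamp, as in the response objects below) of how `anchor` and `minutes` relate to the `expires_at` field returned on a container:

```python
# Sketch: `expires_at` is `minutes` after the resolved anchor timestamp
# (assumption based on the ContainerObject fields shown in this doc).

def compute_expires_at(anchor_timestamp: int, minutes: int) -> int:
    """Expiration lands `minutes` after the anchor point."""
    return anchor_timestamp + minutes * 60

# With last_active_at=1234567890 and the 20-minute window used in the
# Quick Start examples:
print(compute_expires_at(1234567890, 20))  # 1234569090
```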
### List Container Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| `after` | string | No | Cursor for pagination |
| `limit` | integer | No | Number of items to return (1-100, default: 20) |
| `order` | string | No | Sort order: `"asc"` or `"desc"` (default: `"desc"`) |
| `custom_llm_provider` | string | No | LLM provider to use (default: `"openai"`) |
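The `after` cursor lets you walk past the 100-item `limit` cap. A minimal pagination loop might look like this; it is sketched against a generic `fetch_page` callable (standing in for a call like `list_containers(after=..., limit=...)`) rather than a live API, and the stubbed pages mirror the `ContainerListResponse` shape shown below:

```python
# Illustrative cursor pagination. Each page is assumed to carry the
# ContainerListResponse fields used here: "data", "last_id", "has_more".

def list_all_containers(fetch_page, limit=100):
    items, after = [], None
    while True:
        page = fetch_page(after=after, limit=limit)
        items.extend(page["data"])
        if not page["has_more"]:
            return items
        after = page["last_id"]  # resume from the last item seen

# Stubbed two-page listing to show the loop following the cursor:
pages = {
    None: {"data": [{"id": "cntr_1"}, {"id": "cntr_2"}],
           "last_id": "cntr_2", "has_more": True},
    "cntr_2": {"data": [{"id": "cntr_3"}],
               "last_id": "cntr_3", "has_more": False},
}
all_containers = list_all_containers(lambda after, limit: pages[after])
print([c["id"] for c in all_containers])  # ['cntr_1', 'cntr_2', 'cntr_3']
```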
### Retrieve/Delete Container Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| `container_id` | string | Yes | ID of the container to retrieve/delete |
| `custom_llm_provider` | string | No | LLM provider to use (default: `"openai"`) |
## Response Objects

### ContainerObject

```json
{
  "id": "cntr_123...",
  "object": "container",
  "created_at": 1234567890,
  "name": "My Container",
  "status": "active",
  "last_active_at": 1234567890,
  "expires_at": 1234569090,
  "file_ids": []
}
```
### ContainerListResponse

```json
{
  "object": "list",
  "data": [
    {
      "id": "cntr_123...",
      "object": "container",
      "created_at": 1234567890,
      "name": "My Container",
      "status": "active"
    }
  ],
  "first_id": "cntr_123...",
  "last_id": "cntr_456...",
  "has_more": false
}
```
### DeleteContainerResult

```json
{
  "id": "cntr_123...",
  "object": "container.deleted",
  "deleted": true
}
```