The AgentEnv Python SDK (agentenv) provides a programmatic interface for managing sandboxes, snapshots, browser sessions, notebooks, clusters, and more.

Installation

pip install agv

Quick Start

import agentenv as agv

# Create a client
client = agv.Client(api_key="sk_live_xxxxx")

# Create a sandbox and wait for it to start
sandbox = client.sandbox.create(image="python:3.11", cpu=2000, memory=4096)
sandbox.wait_for_status("running")

# Execute a command
result = sandbox.exec("python3 -c 'print(1 + 1)'")
print(result.stdout)

# Expose a port
mapping = sandbox.expose_port(8080, "http")
print(mapping.public_url)

# Get logs
logs = sandbox.logs()

# Clean up
sandbox.stop()
sandbox.delete()

The SDK automatically reads credentials from ~/.agentenv/config.yaml and from environment variables. If you have already logged in via agv login, you can omit api_key:
client = agv.Client()

Client API

Sandboxes

# Create
sandbox = client.sandbox.create(
    image="python:3.11",
    cpu=2000,
    memory=4096,
    name="my-sandbox",
    env={"PYTHONUNBUFFERED": "1"},
    max_lifetime_seconds=900,
)

# List / get
sandboxes = client.sandbox.list_all()
sandbox = client.sandbox.get("sb_abc123")

# Lifecycle
sandbox.wait_for_status("running")
sandbox.stop()
sandbox.start()
sandbox.delete()
sandbox.refresh()  # re-fetch from API

# Execute commands
result = sandbox.exec("echo hello")
print(result.stdout, result.exit_code)

# Ports
sandbox.expose_port(8080, "http")
ports = sandbox.list_ports()
mapping = sandbox.ensure_port(8080, "http")  # idempotent

# HTTP helpers
response = sandbox.http_request(8080, path="/health")
ready = sandbox.http_ready(8080, path="/health", expected_status=200)
ready = sandbox.tcp_ready(5432)

# Snapshot from sandbox
snapshot = sandbox.create_snapshot(name="checkpoint-1")
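
The snippets above suggest http_ready and tcp_ready return booleans immediately rather than blocking, so callers may want a simple polling wrapper. A minimal sketch (wait_until is not part of the SDK, and the timeout/interval defaults are arbitrary):

```python
import time

def wait_until(check, timeout=60.0, interval=2.0):
    """Poll `check` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while True:
        if check():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)

# With a sandbox from the API above (not executed here):
# wait_until(lambda: sandbox.http_ready(8080, path="/health"))
```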

Snapshots

# Create from sandbox
snapshot = client.snapshot.create("sb_abc123", name="my-snapshot")

# List / get
snapshots = client.snapshot.list_all()
snapshot = client.snapshot.get("snap_abc123")

# Restore to a new sandbox
sandbox = client.snapshot.restore("snap_abc123", name="restored")

Browser Sessions

session = client.browser.create(
    screen_width=1920,
    screen_height=1080,
    stealth=True,
    profile_mode="ephemeral",
)

sessions = client.browser.list_all()

Notebook Sessions

# Create session
session = client.notebook.create(
    workspace_id="wk_abc123",
    type="small",
    image="docker://quay.io/jupyter/datascience-notebook:notebook-7.5.5",
)

# Document management
doc = client.notebook.create_document("wk_abc123", "Analysis")
docs = client.notebook.list_documents("wk_abc123")

# Cell operations
cell = client.notebook.add_cell(doc.id, "print('hello')", cell_type="code")
result = client.notebook.execute_cell(doc.id, cell_id=cell.id, wait=True)
client.notebook.update_cell(doc.id, "print('updated')", cell_id=cell.id)
client.notebook.remove_cell(doc.id, cell_id=cell.id)

# Kernel management
client.notebook.ensure_kernel(doc.id, wait=True, timeout=120)

# Import / export
doc = client.notebook.import_document("wk_abc123", file_path="./notebook.ipynb")
content = client.notebook.export_document(doc.id)

Managed Agents

agent = client.managed_agent.create(
    name="Research Agent",
    workspace_id="wk_abc123",
    upstream_id="up_abc123",
)

agents = client.managed_agent.list_all(workspace_id="wk_abc123")
agent = client.managed_agent.get("agent_abc123")

# Messaging
messages = client.managed_agent.list_messages("agent_abc123")
events = client.managed_agent.send_message("agent_abc123", "summarize the repo")

# Streaming
for event in client.managed_agent.stream_message("agent_abc123", "hello"):
    print(event)

# Lifecycle
client.managed_agent.wake("agent_abc123")
forked = client.managed_agent.fork("agent_abc123")
client.managed_agent.delete("agent_abc123")

Workspaces & Billing

ws = client.workspace.create("My Workspace")
workspaces = client.workspace.list_all()

balance = client.billing.get_balance("wk_abc123")

Remote Functions

The @agv.function decorator runs a Python function inside a sandbox:
import agentenv as agv

@agv.function("small", image="python:3.11-slim")
def add(x, y):
    return x + y

print(add(2, 3))  # 5

The first argument is a preset type (micro, small, medium, large, xl) or a cpu:memory spec. A sandbox is created on each call and destroyed after execution.
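
For illustration, a cpu:memory spec is a single string combining the two values passed to sandbox.create, e.g. "2000:4096" for 2000 CPU millicores and 4096 MiB (assuming the spec uses the same units as the create call; the parser below is a hypothetical helper, not part of the SDK):

```python
def parse_resource_spec(spec):
    """Split a "cpu:memory" spec string into an (int, int) pair.

    Hypothetical helper for illustration only; it assumes the spec
    uses the same units as sandbox.create (cpu millicores, memory MiB).
    """
    cpu, _, memory = spec.partition(":")
    return int(cpu), int(memory)

# parse_resource_spec("2000:4096") == (2000, 4096)
```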

With Image Builder

Combine @agv.function with the image builder to install dependencies:
import agentenv as agv

builder = agv.py().python_packages(["numpy", "scipy"])

@agv.function("small", image=builder)
def norm(x):
    import numpy as np
    return float(np.linalg.norm(x))

print(norm([3, 4]))  # 5.0

Image Builder

Build custom images with a fluent API:
import agentenv as agv

image = (
    agv.py("3.11")
    .python_packages(["numpy", "pandas", "scikit-learn"])
    .system_packages(["ffmpeg", "imagemagick"])
    .env({"PYTHONUNBUFFERED": "1"})
    .copy("./src", "/workspace/src")
    .tag("ml-base:latest")
    .build()
)

Builder Methods

| Method | Description |
| --- | --- |
| agv.py(version) | Start from a Python slim base image |
| .python_packages(["pkg"]) | Install pip packages |
| .system_packages(["pkg"]) | Install system packages (apt) |
| .env({"K": "V"}) | Set environment variables |
| .workdir("/path") | Set working directory |
| .copy("local", "remote") | Copy local files into the image |
| .run("command") | Run a shell command during build |
| .tag("name:tag") | Tag the resulting image |
| .enable_ray() | Install Ray runtime |
| .enable_spark() | Install Spark runtime |
| .cache(mode) | Control build cache behavior ("reuse" or "clean") |
| .build() | Execute the build and return the image |

Ray Clusters

Provision a Ray cluster and connect automatically:
pip install "agv[ray]"

import agentenv as agv

# Shape: "4x2xH100" = head + 4 workers, 2x H100 per worker
# Shape: "2xH100"   = single node (head only), 2x H100
cluster = agv.ray_init("4x2xH100", allow_cidr="1.2.3.4/32")

import ray

@ray.remote
def f(x):
    return x + 1

print(ray.get(f.remote(1)))

cluster.close(stop_cluster=True)

Spark Clusters

Provision a Spark cluster via Spark Connect:
pip install "agv[spark]"

import agentenv as agv

spark = agv.spark_init("4x2xH100", allow_cidr="1.2.3.4/32")
print(spark.range(10).count())

spark.close(stop_cluster=True)

Context Manager

The client supports context manager usage for automatic cleanup:
import agentenv as agv

with agv.Client(api_key="sk_live_xxxxx") as client:
    sandbox = client.sandbox.create(image="python:3.11")
    sandbox.wait_for_status("running")
    result = sandbox.exec("echo hello")
    print(result.stdout)
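
For code that creates sandboxes outside a client context, the same create/stop/delete lifecycle shown earlier can be wrapped in a context manager so cleanup runs even when a step raises. A convenience sketch built on the documented calls (managed_sandbox is not part of the SDK):

```python
from contextlib import contextmanager

@contextmanager
def managed_sandbox(client, **kwargs):
    """Create a sandbox, wait for it to run, and always clean it up on exit."""
    sandbox = client.sandbox.create(**kwargs)
    try:
        sandbox.wait_for_status("running")
        yield sandbox
    finally:
        # Runs on normal exit and on exceptions inside the with block.
        sandbox.stop()
        sandbox.delete()

# Usage (assuming a configured client):
# with managed_sandbox(client, image="python:3.11") as sb:
#     print(sb.exec("echo hello").stdout)
```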