Documentation Index
Fetch the complete documentation index at: https://docs.usesynth.ai/llms.txt
Use this file to discover all available pages before exploring further.
Use the managed-research package when you want scripts, CI jobs, notebooks, or typed Python workflows.
Install and authenticate
uv add managed-research
export SYNTH_API_KEY="sk_..."
import os
from managed_research import ManagedResearchClient
client = ManagedResearchClient(api_key=os.environ["SYNTH_API_KEY"])
Start a one-off run
run = client.runs.start(
    "Review the project context and propose the smallest high-impact improvement.",
    host_kind="daytona",
    work_mode="directed_effort",
    providers=[{"provider": "openrouter"}],
    runbook="lite",
)
print("project:", run.project_id)
print("run:", run.run_id)
Wait and inspect evidence
result = run.wait(timeout=60 * 60, poll_interval=15)
print("state:", result.state.value)
print("stop reason:", result.stop_reason_message or result.stop_reason)
print("tasks:", run.task_counts())
print("actors:", run.actor_counts())
for message in run.messages(limit=20):
    print(message.get("role"), message.get("status"), message.get("body", "")[:120])
manifest = run.artifact_manifest()
print("output files:", [artifact.path for artifact in manifest.output_files])
for artifact in run.artifacts():
    print(artifact.artifact_id, artifact.artifact_type, artifact.title)
Create a project and preflight
project = client.projects.create(name="Improve my eval runner")
project.repositories.attach(github_repo="owner/repo")
project.context.set_project_knowledge(
    "Focus on reproducible evals, clear failure reporting, and reviewable evidence."
)
preflight = project.runs.preflight(
    host_kind="daytona",
    work_mode="directed_effort",
    providers=[{"provider": "openrouter"}],
    runbook="lite",
)
if not preflight.clear_to_trigger:
    raise RuntimeError(preflight.resolution_reason or "Run is not ready to start")
run = project.runs.start(
    "Inspect the eval runner, fix the highest-leverage issue, and explain the evidence.",
    host_kind="daytona",
    work_mode="directed_effort",
    providers=[{"provider": "openrouter"}],
    runbook="lite",
)
Branch from a checkpoint
checkpoint = run.create_checkpoint(reason="before trying a riskier refactor")
retry = run.branch_from_checkpoint(
    checkpoint_id=checkpoint.checkpoint_id,
    mode="with_message",
    message="Try again from this checkpoint, but optimize for the smallest patch.",
)
print("branched run:", retry.child_run_id)
Errors and denials
The SDK raises typed exceptions for launch-time failures. Use preflight when you want structured blockers reported before any runtime is spent.
Common denials include unsupported harness/model pairs, unsupported reasoning effort levels, missing authentication, budget caps, incomplete project setup, and provider unavailability.
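The typed-exception pattern above can be sketched as follows. This is a minimal, self-contained illustration: the exception class RunLaunchDenied and its reason attribute are hypothetical stand-ins, not the SDK's real exception types, so check the SDK reference for the actual names before catching anything.

```python
# Hypothetical typed launch-time exception; the real SDK's class names may differ.
class RunLaunchDenied(Exception):
    def __init__(self, reason: str):
        super().__init__(reason)
        self.reason = reason


def start_or_report(start_fn):
    """Attempt a launch; turn a typed denial into a readable report instead of a crash."""
    try:
        return start_fn()
    except RunLaunchDenied as denial:
        # A denial carries one of the documented blockers, e.g. a budget cap
        # or an unsupported harness/model pair.
        return f"denied: {denial.reason}"


def denied_launch():
    # Simulate a launch the service refuses.
    raise RunLaunchDenied("unsupported harness/model pair")


print(start_or_report(denied_launch))  # denied: unsupported harness/model pair
```

The same catch-and-report shape works for real runs: wrap the client.runs.start(...) call in the try block, and fall back to preflight when you want the blockers enumerated up front rather than raised at launch.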