Config

Main configuration class for agent initialization.

| Field | Type | Description |
| --- | --- | --- |
| `model_id` | `str` | Select the underlying model used by the agent. If you are using the default `model_type` (`AnyLLM`), refer to the AnyLLM Provider Docs for the list of providers and how to access them. |
| `api_base` | `str \| None` | Custom API endpoint URL for the model provider. Use this to specify custom endpoints for local models (Ollama, llama.cpp, etc.) or proxy services. For example: `http://localhost:11434/v1` for Ollama. |
| `api_key` | `str \| None` | API key for authenticating with the model provider. By default, any-llm automatically searches for common environment variables (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.). Only set this explicitly when using custom environment variable names or providing keys dynamically. |
| `description` | `str \| None` | Description of the agent. |
| `name` | `str` | The name of the agent. Defaults to `any_agent`. |
| `instructions` | `str \| None` | The instructions for the agent (often also referred to as a `system_prompt`). |
| `tools` | `list[str \| MCPStdio \| MCPSse \| MCPStreamableHttp \| Callable[..., Any]]` | List of tools to be used by the agent. See more info at Tools. |
| `callbacks` | `list[Callback]` | List of callbacks to use during agent invocation. See more info at Callbacks. |
| `agent_type` | `Callable[..., Any] \| None` | Control the type of agent class used by the framework; this is unique to the framework used. Check the individual Frameworks pages for more info on the defaults. |
| `agent_args` | `MutableMapping[str, Any] \| None` | Pass arguments to the `agent_type` instance. |
| `model_type` | `Callable[..., Any] \| None` | Control the type of model class used by the agent framework; this is unique to the agent framework being used. For each framework, we use `AnyLLM` as the default `model_type`, allowing you to use the same `model_id` syntax across these frameworks. |
| `model_args` | `MutableMapping[str, Any] \| None` | Pass arguments to the model instance, like `temperature` and `top_k`, as well as any other provider-specific parameters. Refer to the any-llm Completion API Docs for more info. |
| `any_llm_args` | `MutableMapping[str, Any] \| None` | Pass arguments to `AnyLLM.create()` when using integrations backed by any-llm. Use this for provider/client initialization options that are not completion-time generation params (those should be passed via `model_args`). |
| `output_type` | `type[BaseModel] \| None` | Control the output schema returned from calling `run`. By default, the agent returns a `str`. Use this parameter to define a Pydantic model that will be returned by the agent run methods. |

MCPStdio

Configuration for running an MCP server as a local subprocess.

| Field | Type | Description |
| --- | --- | --- |
| `command` | `str` | The executable to run to start the server. For example: `docker`, `uvx`, `npx`. |
| `args` | `Sequence[str]` | The arguments to pass to the command. |
| `env` | `dict[str, str] \| None` | The environment variables to set for the server. |
| `tools` | `Sequence[str] \| None` | List of tool names to use from the server. If `None`, all tools are used. |
| `client_session_timeout_seconds` | `float \| None` | The read timeout passed to the MCP `ClientSession`. |

MCPStreamableHttp

Configuration for connecting to an MCP server via Streamable HTTP transport.

| Field | Type | Description |
| --- | --- | --- |
| `url` | `str` | The URL of the server. |
| `headers` | `Mapping[str, str] \| None` | The headers to send to the server. |
| `tools` | `Sequence[str] \| None` | List of tool names to use from the server. If `None`, all tools are used. |
| `client_session_timeout_seconds` | `float \| None` | The read timeout passed to the MCP `ClientSession`. |

MCPSse

Configuration for connecting to an MCP server via SSE transport (deprecated).

| Field | Type | Description |
| --- | --- | --- |
| `url` | `str` | The URL of the server. |
| `headers` | `Mapping[str, str] \| None` | The headers to send to the server. |
| `tools` | `Sequence[str] \| None` | List of tool names to use from the server. If `None`, all tools are used. |
| `client_session_timeout_seconds` | `float \| None` | The read timeout passed to the MCP `ClientSession`. |

Configuration for serving agents via the Agent2Agent Protocol.


Configuration for serving agents via the Model Context Protocol.

| Field | Type | Description |
| --- | --- | --- |
| `host` | `str` | Will be passed as an argument to `uvicorn.run`. |
| `port` | `int` | Will be passed as an argument to `uvicorn.run`. |
| `endpoint` | `str` | Will be passed as an argument to `Starlette().add_route`. |
| `log_level` | `str` | Will be passed as an argument to the uvicorn server. |
| `version` | `str` | The version of the server. |
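The table says the first three fields are forwarded to `uvicorn.run` and `Starlette().add_route`. A self-contained sketch of that forwarding, with a stand-in config class whose default values are placeholders (not the library's defaults); the actual uvicorn/Starlette calls are shown as comments so the sketch runs without those packages:

```python
from dataclasses import dataclass

# Stand-in for the serving config described above; defaults are placeholders.
@dataclass
class MCPServingConfig:
    host: str = "localhost"
    port: int = 8000
    endpoint: str = "/mcp"
    log_level: str = "info"
    version: str = "0.1.0"

cfg = MCPServingConfig()

# host/port/log_level are forwarded to the uvicorn server:
uvicorn_kwargs = {"host": cfg.host, "port": cfg.port, "log_level": cfg.log_level}
# uvicorn.run(app, **uvicorn_kwargs)

# endpoint is forwarded to Starlette routing:
# app.add_route(cfg.endpoint, handler)
print(uvicorn_kwargs)
```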

Enum of supported agent frameworks.

| Name | Value |
| --- | --- |
| `GOOGLE` | `google` |
| `LANGCHAIN` | `langchain` |
| `LLAMA_INDEX` | `llama_index` |
| `OPENAI` | `openai` |
| `AGNO` | `agno` |
| `SMOLAGENTS` | `smolagents` |
| `TINYAGENT` | `tinyagent` |
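The name/value pairs above correspond to a Python `Enum`. Reproduced as one below; the members and values come from the table, while the class name `AgentFramework` is an assumption for illustration:

```python
from enum import Enum

# The supported-frameworks enum from the table above
# (class name is assumed; members and values are from the table).
class AgentFramework(str, Enum):
    GOOGLE = "google"
    LANGCHAIN = "langchain"
    LLAMA_INDEX = "llama_index"
    OPENAI = "openai"
    AGNO = "agno"
    SMOLAGENTS = "smolagents"
    TINYAGENT = "tinyagent"

print(AgentFramework.OPENAI.value)  # → openai
```

Subclassing `str` alongside `Enum` lets members compare equal to their plain string values, which is convenient when a framework name arrives as user input or config text.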