Responses
Response Types
Data models and types for API responses.
any_llm.types.responses
ResponsesParams
Bases: BaseModel
Normalized parameters for the Responses API.
This model is used internally to pass structured parameters from the public API layer to provider implementations, avoiding very long function signatures while keeping type safety.
Source code in src/any_llm/types/responses.py
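As a quick orientation before the attribute listing, here is a minimal sketch of building the model directly. It assumes the import path any_llm.types.responses shown above and an OpenAI-style list-of-messages input payload; in normal use the public API layer constructs this object for you.

```python
# Minimal sketch: constructing ResponsesParams by hand (normally done by the
# public API layer). Field names match the attributes documented below; the
# input shape assumes the OpenAI Responses list-of-messages convention.
from any_llm.types.responses import ResponsesParams

params = ResponsesParams(
    model="mistral-small-latest",
    input=[{"role": "user", "content": "Summarize the latest release notes."}],
    max_output_tokens=512,
    temperature=0.2,
    stream=False,
)

# Pydantic v2-style dump, omitting optional fields left at None.
print(params.model_dump(exclude_none=True))
```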
input (instance-attribute)
The input payload accepted by the provider's Responses API. For OpenAI-compatible providers, this is typically a list mixing text, images, and tool instructions, or a dict per the OpenAI spec.
max_output_tokens = None (class-attribute, instance-attribute)
Maximum number of tokens to generate.
model (instance-attribute)
Model identifier (e.g., 'mistral-small-latest').
parallel_tool_calls = None (class-attribute, instance-attribute)
Whether to allow parallel tool calls.
reasoning = None (class-attribute, instance-attribute)
Configuration options for reasoning models.
response_format = None (class-attribute, instance-attribute)
Format specification for the response.
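A hedged illustration of this field: the Pydantic-model form below is an assumption modeled on common OpenAI-compatible structured-output conventions, not a documented contract of response_format, and the Review model is purely hypothetical.

```python
# Sketch only: assumes response_format accepts a schema-like value (a Pydantic
# model class or a dict schema), as in OpenAI-compatible structured-output
# conventions. Review is a made-up example type, not part of any_llm.
from pydantic import BaseModel

from any_llm.types.responses import ResponsesParams

class Review(BaseModel):
    sentiment: str
    score: float

params = ResponsesParams(
    model="mistral-small-latest",
    input=[{"role": "user", "content": "Rate this review: 'Great product!'"}],
    response_format=Review,  # assumption: schema-like value for structured output
)
```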
stream = None (class-attribute, instance-attribute)
Whether to stream the response.
stream_options = None (class-attribute, instance-attribute)
Additional options controlling streaming behavior.
temperature = None (class-attribute, instance-attribute)
Controls randomness in the response (0.0 to 2.0).
tool_choice = None (class-attribute, instance-attribute)
Controls which tools the model can call.
tools = None (class-attribute, instance-attribute)
List of tools for tool calling. These should be converted to OpenAI tool-format dicts.
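To make the "OpenAI tool-format dicts" note concrete, here is a hedged sketch of one tool definition together with a matching tool_choice. The flattened shape below follows the OpenAI Responses-API convention; some providers expect the nested chat-completions shape instead, and the get_weather tool is purely illustrative, not part of any_llm.

```python
# Illustrative only: a tool dict in the flattened OpenAI Responses-API shape.
# Some OpenAI-compatible providers expect the nested chat-completions shape
# ({"type": "function", "function": {...}}) instead.
from any_llm.types.responses import ResponsesParams

get_weather_tool = {
    "type": "function",
    "name": "get_weather",  # hypothetical tool name
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

params = ResponsesParams(
    model="mistral-small-latest",
    input=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[get_weather_tool],
    tool_choice="auto",          # let the model decide whether to call the tool
    parallel_tool_calls=False,   # at most one tool call per turn
)
```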
top_logprobs = None (class-attribute, instance-attribute)
Number of top alternatives to return when logprobs are requested.
top_p = None (class-attribute, instance-attribute)
Controls diversity via nucleus sampling (0.0 to 1.0).