Using Callbacks¶
This cookbook illustrates how to implement and use custom callbacks in any-agent.
You can find more information about callbacks in the docs: https://mozilla-ai.github.io/any-agent/agents/callbacks/
%pip install 'any-agent'
import nest_asyncio
nest_asyncio.apply()
import os
from getpass import getpass
for key in ("MISTRAL_API_KEY", "TAVILY_API_KEY"):
    if key not in os.environ:
        print(f"{key} not found in environment!")
        api_key = getpass(f"Please enter your {key}: ")
        os.environ[key] = api_key
        print(f"{key} set for this session!")
    else:
        print(f"{key} found in environment.")
from any_agent import AgentConfig, AnyAgent
from any_agent.tools import search_tavily
Running with default callbacks¶
any-agent ships with a set of default callbacks that are used when you don't pass a value to AgentConfig.callbacks:
agent = AnyAgent.create(
    "tinyagent",
    AgentConfig(model_id="mistral:mistral-small-latest", tools=[search_tavily]),
)
agent_trace = agent.run("What are 5 LLM agent frameworks that are trending in 2025?")
Adding a callback to offload sensitive information¶
Some inputs and/or outputs in your traces might contain sensitive information that you don't want exposed.
We are going to implement a callback that takes the value of the input messages (which contain the instructions and the user prompt), writes it to an external destination (in this example, a local file), and replaces the value of the attribute in the span with a reference to that external destination.
import json
from pathlib import Path
from any_agent.callbacks.base import Callback
from any_agent.callbacks.context import Context
from any_agent.tracing.attributes import GenAI
class SensitiveDataOffloader(Callback):
    def __init__(self, output_dir: str) -> None:
        self.output_dir = Path(output_dir)
        self.output_dir.mkdir(exist_ok=True, parents=True)

    def before_llm_call(self, context: Context, *args, **kwargs) -> Context:
        span = context.current_span

        if input_messages := span.attributes.get(GenAI.INPUT_MESSAGES):
            output_file = self.output_dir / f"{span.get_span_context().trace_id}.txt"
            output_file.write_text(str(input_messages))

            span.set_attribute(
                GenAI.INPUT_MESSAGES, json.dumps({"ref": str(output_file)})
            )

        return context
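Because the callback stores a JSON object of the form {"ref": "<path>"} in the span attribute, the offloaded data stays recoverable. As a standalone sketch (standard library only; `resolve_ref` is a hypothetical helper, not part of any-agent), this is how a consumer of the trace could follow such a reference back to the original content:

```python
import json
from pathlib import Path


def resolve_ref(attribute_value: str) -> str:
    """Follow a {"ref": "<path>"} attribute back to the offloaded text."""
    ref = json.loads(attribute_value)["ref"]
    return Path(ref).read_text()


# Simulate what the callback would write for one span:
output_dir = Path("sensitive-info")
output_dir.mkdir(exist_ok=True, parents=True)
output_file = output_dir / "example.txt"
output_file.write_text("instructions + user prompt")

attribute = json.dumps({"ref": str(output_file)})
print(resolve_ref(attribute))  # prints the offloaded text
```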
We can now provide our callback to the agent.
You can find more information in:
https://mozilla-ai.github.io/any-agent/agents/callbacks/#providing-your-own-callbacks
from any_agent.callbacks import get_default_callbacks
agent = AnyAgent.create(
    "tinyagent",
    AgentConfig(
        model_id="mistral:mistral-small-latest",
        tools=[search_tavily],
        callbacks=[SensitiveDataOffloader("sensitive-info"), *get_default_callbacks()],
    ),
)
agent_trace = agent.run("What are 5 LLM agent frameworks that are trending in 2025?")
As you can see in the console output, the input messages in the trace have now been replaced by a reference to the external destination.
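If you want to double-check what was offloaded, you can inspect the output directory directly. A minimal sketch (standard library only; assumes the `sensitive-info` directory used above, and `list_offloaded` is a hypothetical helper):

```python
from pathlib import Path


def list_offloaded(directory: str) -> list[str]:
    """Return the names of offloaded files, or an empty list if none exist."""
    out = Path(directory)
    return sorted(f.name for f in out.glob("*.txt")) if out.exists() else []


for name in list_offloaded("sensitive-info"):
    print(name)
```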