ToolsAgent class

An agent powered by the tool calling API.

Example:

final llm = ChatOllama(
  defaultOptions: ChatOllamaOptions(
    model: 'llama3-groq-tool-use',
    temperature: 0,
  ),
);
final tools = [CalculatorTool()];
final agent = ToolsAgent.fromLLMAndTools(llm: llm, tools: tools);
final executor = AgentExecutor(agent: agent);
final res = await executor.run('What is 40 raised to the 0.43 power? ');

You can use any chat model that supports tools, like ChatOpenAI, ChatOllama, ChatAnthropic, ChatFirebaseVertexAI, etc. Check the documentation for a complete list.
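
For example, the same agent can run on ChatOpenAI instead of ChatOllama. The following is a minimal sketch, assuming the langchain_openai package and an openAiApiKey variable supplied by your application:

final llm = ChatOpenAI(
  apiKey: openAiApiKey, // assumed to be provided by your app
  defaultOptions: ChatOpenAIOptions(
    model: 'gpt-4o',
    temperature: 0,
  ),
);
final agent = ToolsAgent.fromLLMAndTools(llm: llm, tools: [CalculatorTool()]);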

You can easily add memory to the agent using the memory parameter from the ToolsAgent.fromLLMAndTools constructor. Make sure you enable BaseChatMemory.returnMessages on your memory, as the agent works with ChatMessages. The default prompt template already takes care of adding the history to the prompt. For example:

final memory = ConversationBufferMemory(returnMessages: true);
final agent = ToolsAgent.fromLLMAndTools(
  llm: llm,
  tools: tools,
  memory: memory,
);
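
With returnMessages enabled, previous turns are injected into the prompt on each run, so follow-up questions can refer to earlier answers. A short usage sketch, reusing the agent built above:

final executor = AgentExecutor(agent: agent);
await executor.run('What is 40 raised to the 0.43 power?');
// This follow-up relies on the history stored in the memory.
final res = await executor.run('Multiply that result by 3');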

If you need to use your own llmChain, make sure your prompt template includes the following (see the wiring sketch after the example below):

  • MessagePlaceholder(variableName: agentInputKey): the input to the agent.
  • If you are using memory:
    • MessagesPlaceholder(variableName: '{memoryKey}'): the history of chat messages.
  • If you are not using memory:
    • MessagesPlaceholder(variableName: BaseActionAgent.agentScratchpadInputKey): the intermediary work of the agent (if you are using memory, the agent uses the memory to store the intermediary work).

For example:

ChatPromptTemplate.fromTemplates([
  (ChatMessageType.system, 'You are a helpful AI assistant'),
  (ChatMessageType.messagesPlaceholder, 'history'),
  (ChatMessageType.messagePlaceholder, 'input'),
]);
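
Building on that template, the following sketch shows one way to wire a custom llmChain into the agent. It reuses the llm, tools, and memory variables from the earlier snippets and assumes the model is able to call the listed tools (depending on your setup, you may need to pass the tools through the model's options as well):

final prompt = ChatPromptTemplate.fromTemplates([
  (ChatMessageType.system, 'You are a helpful AI assistant'),
  (ChatMessageType.messagesPlaceholder, 'history'),
  (ChatMessageType.messagePlaceholder, 'input'),
]);
final llmChain = LLMChain(
  llm: llm,
  prompt: prompt,
  memory: memory,
);
final agent = ToolsAgent(llmChain: llmChain, tools: tools);
final executor = AgentExecutor(agent: agent);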

You can use ToolsAgent.createPrompt to build the prompt template if you only need to customize the system message or add some extra messages.
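
For example (a sketch; it assumes SystemChatMessagePromptTemplate.fromTemplate, as used elsewhere in the package, and the memory variable from above):

final prompt = ToolsAgent.createPrompt(
  systemChatMessage: SystemChatMessagePromptTemplate.fromTemplate(
    'You are a helpful AI assistant specialised in maths.',
  ),
  memory: memory,
);

The returned template can then be passed as the prompt of your own llmChain, as in the wiring sketch above.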

Constructors

ToolsAgent.new({required LLMChain<BaseChatModel<ChatModelOptions>, ChatModelOptions, BaseChatMemory> llmChain, required List<Tool<Object, ToolOptions, Object>> tools})
An agent powered by the tool calling API.
ToolsAgent.fromLLMAndTools({required BaseChatModel<ChatModelOptions> llm, List<Tool<Object, ToolOptions, Object>>? tools, BaseChatMemory? memory, SystemChatMessagePromptTemplate systemChatMessage = _systemChatMessagePromptTemplate, List<ChatMessagePromptTemplate>? extraPromptMessages})
Constructs a ToolsAgent from an llm and tools.
factory

Properties

agentType → String
Returns the type of agent.
no setter, override
hashCode → int
The hash code for this object.
no setter, inherited
inputKeys → Set<String>
Returns the input keys.
no setter, override
llmChain → LLMChain<BaseChatModel<ChatModelOptions>, ChatModelOptions, BaseChatMemory>
Chain to use to call the LLM.
final
returnValues → Set<String>
The return values of the agent.
no setter, inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited
tools → List<Tool<Object, ToolOptions, Object>>
The tools this agent can use.
final, inherited

Methods

noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
plan(AgentPlanInput input) → Future<List<BaseAgentAction>>
Given the input and previous steps, returns the next action to take.
override
returnStoppedResponse(AgentEarlyStoppingMethod earlyStoppingMethod, List<AgentStep> intermediateSteps) → AgentFinish
Return response when agent has been stopped due to max iterations.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(Object other) → bool
The equality operator.
inherited

Static Methods

createPrompt({SystemChatMessagePromptTemplate systemChatMessage = _systemChatMessagePromptTemplate, List<ChatMessagePromptTemplate>? extraPromptMessages, BaseChatMemory? memory}) → BasePromptTemplate
Creates prompt for this agent.

Constants

agentInputKey → const String
The key for the input to the agent.