Remember Alfred, our helpful butler agent from earlier? Well, he’s about to get an upgrade! Now that we understand the tools available in LlamaIndex, we can give Alfred new capabilities to better serve us.
But before we enhance Alfred’s abilities, let’s remind ourselves what makes an agent like Alfred tick. Back in Unit 1, we learned that:
An Agent is a system that leverages an AI model to interact with its environment in order to achieve a user-defined objective. It combines reasoning, planning, and the execution of actions (often via external tools) to fulfill tasks.
LlamaIndex supports three main types of reasoning agents:

- **Function Calling Agents** - These work with AI models that can call specific functions.
- **ReAct Agents** - These can work with any AI that has a chat or text endpoint and deal with complex reasoning tasks.
- **Advanced Custom Agents** - These use more complex methods to deal with more complex tasks and workflows.
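To make the ReAct (Reason + Act) pattern concrete before we touch any LlamaIndex code, here is an illustrative plain-Python sketch of the loop: the agent reasons about which tool to use, acts by calling it, and feeds the observation back into its answer. This is a toy stand-in, not LlamaIndex internals; in a real agent the LLM produces the reasoning step and chooses the tool arguments.

```python
# Toy ReAct-style loop (illustration only, not LlamaIndex internals).
def multiply(a: int, b: int) -> int:
    return a * b

TOOLS = {"multiply": multiply}

def react_agent(question: str) -> int:
    # Thought: the question asks for a product, so pick the multiply tool.
    # (A real agent would have the LLM produce this reasoning step
    # and extract the arguments from the question.)
    tool_name, args = "multiply", (2, 2)
    # Action: call the chosen tool with the chosen arguments.
    observation = TOOLS[tool_name](*args)
    # Observation: the tool result feeds back into reasoning;
    # here it is already the final answer.
    return observation

print(react_agent("What is 2 times 2?"))  # 4
```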
An agent is initialised from a set of Tools. Here’s an example of instantiating a ReAct agent:
```python
from llama_index.core.tools import FunctionTool
from llama_index.llms.huggingface_api import HuggingFaceInferenceAPI
from llama_index.core.agent import ReActAgent

# define a sample Tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the resulting integer"""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

# initialize the LLM
llm = HuggingFaceInferenceAPI(model_name="meta-llama/Meta-Llama-3-8B-Instruct")

# initialize the ReAct agent
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)
```

Similarly, we can use the AgentRunner to automatically pick the best agent reasoning flow depending on the LLM.
```python
from llama_index.core.agent import AgentRunner

agent_runner = AgentRunner.from_llm(llm, verbose=True)
```

Agents support both chat and query endpoints via `chat()` and `query()`, where chat interactions keep a history of messages.
```python
response = agent.query("What is 2 times 2?")
```

Now that we’ve got the basics, let’s take a look at how we can use tools in our agents.
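The difference between the two endpoints is state: `chat()` accumulates a message history that later turns can refer back to, while `query()` is a one-off call. Here is an illustrative plain-Python sketch of that distinction (a hypothetical stand-in, not LlamaIndex internals):

```python
# Illustrative sketch: chat keeps history, query is stateless.
class ChatSession:
    def __init__(self):
        self.history = []  # list of (role, message) tuples

    def chat(self, message: str) -> str:
        # each chat turn records both sides of the exchange
        self.history.append(("user", message))
        reply = f"echo: {message}"  # stand-in for an LLM response
        self.history.append(("assistant", reply))
        return reply

    def query(self, message: str) -> str:
        # one-off query: nothing is recorded
        return f"echo: {message}"

session = ChatSession()
session.chat("What is 2 times 2?")
session.chat("Now multiply that by 3.")
print(len(session.history))  # 4: two user turns, two assistant turns
session.query("Unrelated question")
print(len(session.history))  # still 4: query() left no trace
```

Because the second `chat()` turn is stored alongside the first, a real agent can resolve references like "that" to earlier answers; `query()` has no such context.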
It is easy to wrap a QueryEngine as a tool for an agent.
When doing so, we need to define a name and description within the ToolMetadata; the LLM uses this information to decide when and how to call the tool.
Let’s see how to load in a QueryEngineTool using the QueryEngine we created in the component section.
```python
from llama_index.core.tools import QueryEngineTool, ToolMetadata

query_engine = index.as_query_engine(similarity_top_k=3)  # as shown in the previous section

query_engine_tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="a specific name",
        description="a specific description",
    ),
    return_direct=False,
)

query_engine_agent = ReActAgent.from_tools([query_engine_tool], llm=llm, verbose=True)
```

Agents in LlamaIndex can directly be used as tools for other agents by loading them as a QueryEngineTool.
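Conceptually, this nesting is just one agent forwarding a question to another and building on the result. Here is an illustrative plain-Python sketch of the delegation pattern (hypothetical stand-in functions, not LlamaIndex classes); the real LlamaIndex wrapping follows below.

```python
# Illustrative sketch of agents-as-tools (not LlamaIndex classes).
def sub_agent(question: str) -> str:
    # stand-in for the wrapped query_engine_agent
    return f"sub-agent answer to: {question}"

def sub_agent_tool(question: str) -> str:
    # stand-in for QueryEngineTool: simply forwards calls to the sub-agent
    return sub_agent(question)

def multi_agent(question: str) -> str:
    # the top-level agent decides to call its tool,
    # then uses the observation to form the final answer
    observation = sub_agent_tool(question)
    return f"Final answer based on: {observation}"

print(multi_agent("What does the report say about revenue?"))
```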
```python
from llama_index.core.tools import QueryEngineTool, ToolMetadata

query_engine_agent = ...  # as defined in the previous section

query_engine_agent_tool = QueryEngineTool(
    query_engine=query_engine_agent,
    metadata=ToolMetadata(
        name="a specific name",
        description="a specific description",
    ),
)

multi_agent = ReActAgent.from_tools([query_engine_agent_tool], llm=llm, verbose=True)
```

Now that we understand the basics of agents and tools in LlamaIndex, let’s see how we can use LlamaIndex to create configurable and manageable workflows!