Using Tools in Thinagents

Supercharge Your Agents with Tools

Tools are the secret sauce that make Thinagents agents truly powerful. With tools, your agent can do more than just chat—it can take action, fetch data, and even call other agents. In Thinagents, tools are simply Python functions that you register with your agent.

What Are Tools?

A tool is any callable (function or method) that you add to your agent. When the agent receives a prompt, it can decide to call one of its tools to help answer the question or complete a task. Tools can be as simple as a math function or as complex as a database query or API call.

Using Normal Functions as Tools

You can pass any regular Python function to the tools parameter when creating your agent. Thinagents will automatically inspect the function's signature and docstring to help the agent understand when and how to use it.

from thinagents import Agent

def add(a: int, b: int) -> int:
    "Add two numbers"
    return a + b

agent = Agent(
    name="Math Agent",
    model="openai/gpt-4o-mini",
    tools=[add],
)

response = agent.run("What is 2 + 3?")
print(response.content)  # Output: 5
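Under the hood, turning a signature and docstring into something the LLM can use means building a JSON-schema-style tool description. The sketch below is not Thinagents' actual implementation, just an illustration of the idea using only the standard library's inspect module:

```python
import inspect

def build_tool_schema(fn):
    """Derive an OpenAI-style tool schema from a function's signature and docstring."""
    type_map = {int: "integer", str: "string", float: "number", bool: "boolean"}
    props = {}
    required = []
    for name, param in inspect.signature(fn).parameters.items():
        # Fall back to "string" for unannotated or unmapped parameter types
        props[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn) or "",
            "parameters": {"type": "object", "properties": props, "required": required},
        },
    }

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

schema = build_tool_schema(add)
```

This is why a clear docstring and precise type annotations matter: they become the description the LLM reads when deciding whether to call the tool.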

Advanced: Decorated Functions for Richer Tools

For more advanced use cases, Thinagents supports decorated tools. By using the @tool decorator, you can provide extra metadata or control how the agent and LLM handle the tool's output. For example, you can use the return_type argument to specify how Thinagents should route the tool's return value, or the name argument to customize the tool's name.

from thinagents import Agent, tool

@tool(name="get_weather")
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    # Imagine this calls a real weather API
    return f"The weather in {city} is sunny."

agent = Agent(
    name="Weather Pro",
    model="openai/gpt-4o-mini",
    tools=[get_weather],
)

response = agent.run("What's the weather in Paris?")
print(response.content)

With the @tool decorator, you can:

  • Optionally provide a custom name for the tool using name
  • Control how Thinagents and the LLM handle the tool's output using return_type
  • (Advanced) Provide a Pydantic model for parameter validation and schema control using pydantic_schema

Notes

  • The return_type argument tells Thinagents how to route the tool's return value, especially for LLMs. It does not change how your function itself handles output.
  • If return_type="content" (the default), the tool's return value is sent directly to the LLM for the next step.
  • If return_type="content_and_artifact", your tool should return a tuple (content, artifact). The content (usually a summary or small result) is sent to the LLM, while the artifact (which can be a large dataset, file, or other object) is made available in the ThinagentResponse for downstream use, but is not sent to the LLM.
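The routing rule in these notes can be sketched as a tiny dispatcher (illustrative only, not the library's actual code): with "content", the whole return value goes to the LLM; with "content_and_artifact", the tuple is split so only the first element reaches the LLM.

```python
def route_tool_result(result, return_type="content"):
    """Toy router: split a tool's return value into LLM-visible
    content and a side-channel artifact."""
    if return_type == "content_and_artifact":
        content, artifact = result  # tool must return a (content, artifact) tuple
        return content, artifact
    return result, None  # default: everything is content, no artifact

# "content": the value itself is sent to the LLM
content, artifact = route_tool_result(5)

# "content_and_artifact": only the summary is LLM-visible
summary, data = route_tool_result(
    ("Found 3 rows", {"rows": [1, 2, 3]}), "content_and_artifact"
)
```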

Example: Returning Content and Artifact

Sometimes, you want to return a large result (like a dataset or file) but only send a summary to the LLM. Use return_type="content_and_artifact" for this:

from thinagents import Agent, tool

@tool(return_type="content_and_artifact")
def summarize_and_return_data(query: str) -> tuple[str, dict]:
    # Imagine this fetches a large dataset and summarizes it
    data = {"rows": list(range(10000))}  # Large artifact
    summary = f"Found {len(data['rows'])} rows for query: {query}"
    return summary, data

agent = Agent(
    name="Data Agent",
    model="openai/gpt-4o-mini",
    tools=[summarize_and_return_data],
)

response = agent.run("Summarize the data for X")
print(response.content)      # Sent to the LLM, e.g. "Found 10000 rows for query: X"
print(response.artifact)     # The full dataset, available downstream but not sent to the LLM

Using Pydantic Schemas for Tool Parameters

You can use a Pydantic model to define and validate your tool's parameters. Pass the Pydantic class to the pydantic_schema argument.

from pydantic import BaseModel, Field
from thinagents import Agent, tool

class MultiplyInputSchema(BaseModel):
    """Multiply two numbers"""
    a: int = Field(description="First operand")
    b: int = Field(description="Second operand")

@tool(name="multiply_tool", pydantic_schema=MultiplyInputSchema)
def multiply(a: int, b: int) -> int:
    return a * b

agent = Agent(
    name="Math Agent",
    model="openai/gpt-4o-mini",
    tools=[multiply],
)

response = agent.run("What is 6 * 7?")
print(response.content)  # Output: 42

  • The tool's name will be multiply_tool (as given by the name argument).
  • The tool's description will be taken from the Pydantic class docstring.
  • The parameter schema will be enforced and described using the Pydantic model.
  • The schema will not include the Pydantic class name as a title, and the description will appear at the function level, not inside parameters.
For reference, the generated tool schema looks like this:

{'tool_schema': {'type': 'function',
  'function': {'name': 'multiply_tool',
   'description': 'Multiply two numbers',
   'parameters': {'properties': {'a': {'description': 'First operand',
      'title': 'A',
      'type': 'integer'},
     'b': {'description': 'Second operand', 'title': 'B', 'type': 'integer'}},
    'required': ['a', 'b'],
    'type': 'object'}}},
 'return_type': 'content'}

Mix and Match

You can mix normal and decorated functions in your agent's toolset. Thinagents will handle both seamlessly.

from thinagents import Agent, tool

@tool(name="multiply")
def multiply(a: int, b: int) -> int:
    return a * b

def greet(name: str) -> str:
    return f"Hello, {name}!"

agent = Agent(
    name="Utility Agent",
    model="openai/gpt-4o-mini",
    tools=[multiply, greet],
)

LangChain Tool Example

You can use LangChain tools directly in Thinagents via the LangchainTool adapter. Here's how to use the Tavily web search tool:

from thinagents import Agent
from thinagents.tools import LangchainTool
from langchain_tavily import TavilySearch

agent = Agent(
    name="Web Search Agent",
    model="gemini/gemini-2.0-flash",
    tools=[LangchainTool(TavilySearch(max_results=5))],
    prompt="You are a web search assistant. Use Tavily to answer questions with up-to-date information."
)

print(agent.run("India Vs England 2nd Test 2025 results").content)

CrewAI Tool Example

You can use CrewAI tools directly in Thinagents via the CrewaiTool adapter. This allows you to wrap any CrewAI BaseTool and use it seamlessly in your agent, just like with LangChain tools.

from thinagents import Agent
from thinagents.tools import CrewaiTool
from crewai_tools import DirectoryReadTool, FileReadTool

# Create CrewAI tool instances
directory_tool = DirectoryReadTool('./../thinagents/core')
file_tool = FileReadTool()

# Wrap them with the ThinAgents adapter
wrapped_directory_tool = CrewaiTool(directory_tool)
wrapped_file_tool = CrewaiTool(file_tool)

# Use them in a ThinAgents agent
agent = Agent(
    name="File System Agent",
    model="openai/gpt-4o-mini",
    tools=[wrapped_directory_tool, wrapped_file_tool]
)

# The agent can now use the CrewAI tools
response = agent.run("explain about agent.py file in thinagents/core")
print(response.content)

With CrewaiTool, you can:

  • Use any CrewAI tool (sync or async) as a ThinAgents tool
  • Automatically generate the correct parameter schema for the agent
  • Mix CrewAI tools with regular Python, LangChain, and Agno tools in your agent's toolset
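Conceptually, adapters like CrewaiTool and LangchainTool all do the same job: wrap a foreign tool object behind the uniform callable interface the agent expects. The real adapters are not shown here; the sketch below, with a made-up FakeCrewaiTool stand-in, only illustrates the pattern:

```python
class ToolAdapter:
    """Illustrative adapter: expose a foreign tool's name, description,
    and run() method behind a plain callable interface."""

    def __init__(self, foreign_tool):
        self._tool = foreign_tool
        self.name = getattr(foreign_tool, "name", type(foreign_tool).__name__)
        self.description = getattr(foreign_tool, "description", "")

    def __call__(self, **kwargs):
        # Delegate execution to the wrapped tool
        return self._tool.run(**kwargs)

class FakeCrewaiTool:
    """Hypothetical stand-in for a CrewAI BaseTool."""
    name = "file_read"
    description = "Read a file from disk"

    def run(self, path: str) -> str:
        return f"contents of {path}"

wrapped = ToolAdapter(FakeCrewaiTool())
```

Because the wrapped tool looks like any other callable with a name and description, the agent can treat native Python functions and adapted third-party tools identically.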

Tip

You can mix and match any number of CrewAI, LangChain, Agno, or regular Python tools in your agent's toolset.

Agno Tool Example: YFinance

You can use Agno toolkits in Thinagents via the AgnoTool adapter. Here's how to use the YFinance toolkit for financial data:

from thinagents import Agent
from thinagents.tools import AgnoTool
from agno.tools.yfinance import YFinanceTools

agent = Agent(
    name="Finance Agent",
    model="gemini/gemini-2.0-flash",
    tools=[*AgnoTool(YFinanceTools(enable_all=True)).get_tools()],
    prompt="You are a finance assistant. Use YFinance tools to answer stock and finance questions."
)

print(agent.run("Current stock price for Microsoft").content)

Combo Example: Python, LangChain, Agno, and CrewAI Tools

You can combine your own Python tools, LangChain tools, Agno toolkits, and CrewAI tools in a single agent for maximum versatility:

from thinagents import Agent
from thinagents.tools import LangchainTool, AgnoTool, CrewaiTool
from langchain_tavily import TavilySearch
from agno.tools.yfinance import YFinanceTools
from crewai_tools import DirectoryReadTool, FileReadTool
import datetime

def get_current_datetime() -> str:
    return datetime.datetime.now().isoformat()

tools = [
    get_current_datetime,  # Custom Python function
    LangchainTool(TavilySearch(max_results=5)),  # LangChain web search tool
    *AgnoTool(YFinanceTools(enable_all=True)).get_tools(),  # Agno finance tools
    CrewaiTool(DirectoryReadTool('./../thinagents/core')),  # CrewAI directory tool
    CrewaiTool(FileReadTool())  # CrewAI file tool
]

agent = Agent(
    name="Universal Assistant",
    model="gemini/gemini-2.0-flash",
    tools=tools,
    prompt="You are a universal assistant with tools for datetime, web search, and finance."
)

print(agent.run("What is the current server time?").content)
print(agent.run("India Vs England 2nd Test 2025 results").content)
print(agent.run("Current stock price for Microsoft").content)
print(agent.run("explain about agent.py file in thinagents/core").content)

Tool Parameters

Prop              Type                                  Default
name?             str                                   None
return_type?      "content" | "content_and_artifact"    "content"
pydantic_schema?  BaseModel                             None
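The semantics of these three parameters can be illustrated with a toy stand-in for the decorator (this is not Thinagents' implementation, just a sketch of how the metadata might be attached to the function):

```python
def tool(name=None, return_type="content", pydantic_schema=None):
    """Toy @tool decorator: record metadata on the function itself,
    leaving the function callable as before."""
    def decorator(fn):
        fn.tool_name = name or fn.__name__      # name? defaults to the function's name
        fn.return_type = return_type            # return_type? defaults to "content"
        fn.pydantic_schema = pydantic_schema    # pydantic_schema? defaults to None
        return fn
    return decorator

@tool(name="multiply_tool")
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b
```

The function still behaves normally when called directly; the extra attributes are only read by the framework when it registers the tool with the agent.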