Overview · Intermediate · 10 min

Tools: Give Your LLM Arms

The @tool decorator, BaseTool, tool schemas from docstrings, bind_tools(), and the tool-call message cycle.

Quick Reference

  • @tool decorator turns any Python function into a tool — docstring becomes the schema
  • model.bind_tools([tool_list]) teaches the model about available tools
  • Tool calls appear as AIMessage.tool_calls — not as text output
  • ToolMessage carries results back to the model with the correct tool_call_id
  • LangChain supports Anthropic, OpenAI, and Gemini tool calling with one API

What Tools Are

The tool calling cycle

A tool is a function the LLM can decide to call. LangChain converts your function's signature and docstring into a JSON schema the model understands. When the model wants to use a tool, it responds with a structured tool_calls request instead of text; your code executes the call and sends the result back to the model as a ToolMessage. The model never executes code directly — it only requests calls.

[Diagram: a ChatModel with .bind_tools([search, calc, ...]) decides whether to call a tool or respond. tool_calls flow to tool execution (web_search(), calculator(), api_call()), which runs the requested tool and returns the result as a ToolMessage; the loop repeats until the model is done and delivers its final answer to the user.]

The tool calling cycle: LLM requests tools, tools return results, LLM decides next step