Free Tool Arena


Tool use (AI)

Tool use (also called function calling) is the ability of an LLM to invoke external functions — web search, calculator, code execution, API calls — instead of only generating text. The runtime executes the call and returns the result for the model to incorporate into its reply.

Updated May 2026 · 4 min read


What it means

All major LLM APIs (Anthropic, OpenAI, Google, Mistral, DeepSeek) support tool use. The pattern: define tools as JSON schemas (name + description + parameters). The model decides whether to call a tool and what arguments to pass; the runtime executes the call and returns the result; the model incorporates it and continues the conversation. Most agent frameworks layer their abstractions on top of this loop.
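The loop above can be sketched in a few lines. This is a minimal, provider-agnostic illustration: the model's reply is simulated as a plain dict (real APIs return tool calls in provider-specific formats), and `get_weather` is a hypothetical stubbed tool.

```python
import json

# A tool defined as a JSON schema: name + description + parameters.
# Field names vary slightly by provider, but the shape is the same everywhere.
TOOLS = {
    "get_weather": {
        "description": "Get the current temperature in Celsius for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
}

def execute_tool(name: str, args: dict) -> str:
    """Runtime side: actually run the tool and return its result as text."""
    if name == "get_weather":
        # Stubbed lookup; a real tool would call a weather API here.
        return json.dumps({"city": args["city"], "temp_c": 21})
    raise ValueError(f"unknown tool: {name}")

# The model's reply is either plain text or a structured tool call.
# Here we simulate the model deciding to call the tool with these arguments.
model_reply = {"tool_call": {"name": "get_weather", "arguments": {"city": "Paris"}}}

if "tool_call" in model_reply:
    call = model_reply["tool_call"]
    result = execute_tool(call["name"], call["arguments"])
    # The result is appended to the conversation and sent back to the model,
    # which then continues generating with the new information in context.
    print(result)
```

Note that the model never executes anything itself: it only emits a structured request, and your runtime decides whether and how to run it.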


Why it matters

Tool use is what turns an LLM from 'text generator' into 'agent that can do things.' Without tool use, the model can only produce text from its training data. With tool use, it can search the web, query databases, run code, send emails. Reliability depends on how well-described your tools are and on whether the model picks the right tool with the right arguments.
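The "well-described" point is concrete: the schema's description and parameter constraints are the only guidance the model gets. A hedged illustration, using two hypothetical schemas for the same underlying tool:

```python
# Two hypothetical schemas for the same tool. The model only sees these
# descriptions, so vague wording directly causes wrong or missing tool calls.
vague_tool = {
    "name": "search",
    "description": "Searches.",  # model can't tell when (or when not) to use this
    "parameters": {
        "type": "object",
        "properties": {"q": {"type": "string"}},
    },
}

precise_tool = {
    "name": "search_product_catalog",
    "description": (
        "Search the internal product catalog by keyword. "
        "Use for questions about products we sell; "
        "do NOT use for general web search."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "Search keywords, e.g. 'red running shoes'",
            },
            "max_results": {"type": "integer", "minimum": 1, "maximum": 20},
        },
        "required": ["query"],  # constraints catch malformed calls early
    },
}
```

A specific name, a description that states both when to use the tool and when not to, and typed, constrained parameters are the cheapest reliability wins available.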


Frequently asked questions

Which provider has the best tool use?

Claude Sonnet 4.6 + Opus 4.7 are most reliable on long tool-using loops. GPT-5 is competitive. Gemini 2.5 Pro improved a lot in 2025-2026.

MCP vs tool use?

MCP (Model Context Protocol) is a protocol layer on top of tool use that makes tools portable across clients. Native tool use is provider-specific; MCP is a shared standard.
