Function calling (tool use)

By Paul Brock·Updated on 24-04-2026
TL;DR

Function calling is the mechanism by which an LLM decides to invoke an external function or API with structured parameters, rather than generating only text.

Function calling transforms LLMs from text generators into action orchestrators. You define functions (with a JSON schema for parameters), pass them to the model, and the LLM can decide to invoke one (or more) based on the user's question. Claude, GPT-4 and Gemini support this natively. The next layer — MCP (Model Context Protocol) — standardises how functions are exposed to any model.
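A tool definition is just a name, a description and a JSON Schema for the parameters. A minimal sketch in the Anthropic Messages API style (the `name` / `description` / `input_schema` fields are the documented shape; the weather tool itself is a hypothetical example):

```python
# Hypothetical tool definition the model can choose to invoke.
get_weather_tool = {
    "name": "get_weather",
    "description": "Return the current weather for a given city.",
    "input_schema": {
        "type": "object",
        "properties": {
            "city": {
                "type": "string",
                "description": "City name, e.g. 'Paris'",
            },
        },
        "required": ["city"],
    },
}
```

The description matters as much as the schema: the model decides *whether* to call the tool based on it, so vague descriptions lead to missed or spurious calls.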

Example

A support chatbot has tools: get_order_status(order_id), create_support_ticket(issue), check_stock(product_id). Query 'where is my package?' → LLM calls get_order_status → receives JSON → formulates a natural-language reply.
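The dispatch step in this flow can be sketched as follows. The model's output and the tool implementation are simulated stubs (a real integration would parse the tool-call block returned by the API and query an order database):

```python
import json

# Hypothetical local implementation of the chatbot's tool.
def get_order_status(order_id: str) -> dict:
    # In production this would query the order system.
    return {"order_id": order_id, "status": "shipped", "eta": "2 days"}

TOOLS = {"get_order_status": get_order_status}

def dispatch(tool_call: dict) -> str:
    """Execute the tool the model asked for; return JSON to feed back to it."""
    fn = TOOLS[tool_call["name"]]
    result = fn(**tool_call["arguments"])
    return json.dumps(result)

# Simulated model decision for the query 'where is my package?'
call = {"name": "get_order_status", "arguments": {"order_id": "A-1042"}}
status_json = dispatch(call)
```

The JSON result then goes back into the conversation, and the model turns it into the natural-language reply the user actually sees.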

Frequently asked questions

What's the difference between function calling and agentic AI?

Function calling is a primitive. Agentic AI stacks loops, planning and memory on top. One function call = one action; an agent performs multiple calls in sequence with intermediate reflection.
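The difference is essentially one loop. A toy sketch, where the "model" is a hypothetical stub scripted to make one tool call and then answer (a real agent would call the LLM at each step):

```python
# Stub standing in for the LLM: first a tool call, then a final answer.
def fake_model(history):
    if not any(m["role"] == "tool" for m in history):
        return {"type": "tool_call", "name": "check_stock",
                "arguments": {"product_id": "P-7"}}
    return {"type": "final", "text": "P-7 is in stock."}

def check_stock(product_id):
    # Hypothetical inventory lookup.
    return {"product_id": product_id, "in_stock": True}

TOOLS = {"check_stock": check_stock}

def run_agent(user_msg, max_steps=5):
    history = [{"role": "user", "content": user_msg}]
    for _ in range(max_steps):  # the loop is what makes it "agentic"
        step = fake_model(history)
        if step["type"] == "final":
            return step["text"]
        result = TOOLS[step["name"]](**step["arguments"])
        history.append({"role": "tool", "content": result})
    return "step budget exhausted"
```

Single function calling is one pass through the body of that loop; an agent runs the loop until the model decides it is done, feeding each tool result back in.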

What's the risk?

Uncontrolled actions if you give the model tools that delete files, send emails or make payments. Best practice: least privilege, human-in-the-loop for irreversible actions.
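Both safeguards can live in the dispatch layer rather than in the model. A sketch (the tool names and the `confirm` callback are illustrative, not a library API):

```python
# Tools whose effects can't be undone require explicit human approval.
IRREVERSIBLE = {"send_email", "make_payment", "delete_file"}

def execute(tool_name, args, registry, confirm):
    """Run a tool under least privilege + human-in-the-loop rules.

    registry: allowlist of tools this agent may use at all.
    confirm:  callback asking a human to approve an irreversible action.
    """
    if tool_name not in registry:  # least privilege: explicit allowlist
        raise PermissionError(f"tool {tool_name!r} not allowed")
    if tool_name in IRREVERSIBLE and not confirm(tool_name, args):
        return {"status": "rejected_by_human"}
    return registry[tool_name](**args)
```

Keeping the check in code means a prompt-injected model still cannot pay, mail or delete anything without a human clicking approve.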

Further reading

  • → Our service: GEO

Need help with SEO or GEO?

We help Bitcoin, AI and fintech companies get found in Google and in AI search engines.

Book a call