How to Use Marvin AI
Installing marvin, classify/extract/generate, ai_fn decorator, Pydantic schemas, and backends.
Marvin turns Python functions into structured AI tools without the usual prompt-engineering boilerplate.
Marvin is a Python toolkit from the Prefect team that wraps LLM calls in typed, reliable primitives. Instead of hand-writing prompts and parsing responses, you decorate a function signature and Marvin handles the schema coercion, retries, and validation. It feels like regular Python, which is the point.
What it is
Marvin exposes a small set of primitives — classify, extract, generate, cast, and the @ai_fn decorator — that delegate to an LLM backend (OpenAI by default) and return Pydantic-typed results. Under the hood it generates a system prompt from the type signature and docstring, calls the model, and parses JSON back into Python objects.
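The primitives share one calling shape: pass raw data plus a target type or label set, and get a typed value back. A minimal sketch, assuming the Marvin 2.x function names and an OPENAI_API_KEY in the environment; each call round-trips to the model, so these lines are not free to run:

```python
import marvin
from pydantic import BaseModel

class Location(BaseModel):
    city: str
    country: str

# classify: pick one label from a known set
label = marvin.classify("This is amazing!", labels=["positive", "negative"])

# extract: pull typed entities out of free text
stops = marvin.extract("I moved from Paris to Tokyo.", target=Location)

# cast: coerce an entire string into one structured object
capital = marvin.cast("the capital of Japan", target=Location)
```

In each case Marvin derives the JSON schema from the Pydantic model, instructs the model to emit matching JSON, and validates the response before returning it.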
Install / set up
# install
pip install marvin
export OPENAI_API_KEY=sk-...
First run
The fastest demo is classification. Give Marvin a string and a list of labels, and it returns the best match as a typed value.
$ python -c "import marvin; print(marvin.classify('This is amazing!', ['positive','negative']))"
positive
Everyday workflows
- Wrap a business rule with @ai_fn so product code calls a typed function instead of an LLM directly.
- Use marvin.extract to pull entities (emails, prices, dates) out of unstructured support tickets.
- Chain marvin.generate with Pydantic models to fabricate test fixtures that match your schema.
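The first pattern might look like the sketch below. Note the decorator name: @ai_fn is the Marvin 1.x spelling, renamed @marvin.fn in 2.x; the Ticket model and triage function here are invented for illustration:

```python
import marvin
from pydantic import BaseModel

class Ticket(BaseModel):
    priority: str  # e.g. "low", "medium", "high"
    summary: str

@marvin.fn  # spelled @ai_fn in Marvin 1.x
def triage(ticket_text: str) -> Ticket:
    """Assign a priority and write a one-line summary of the support ticket."""

# The body is intentionally empty: Marvin builds a prompt from the
# signature and docstring, calls the model, and parses the JSON
# response into a Ticket instance.
```

Callers see an ordinary typed function, which keeps the LLM dependency behind a stable interface.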
Gotchas and tips
Marvin is thin on purpose. It does not ship a vector store, a chat UI, or an agent loop — if you need those, pair it with LangChain or ControlFlow (also from Prefect). Treat Marvin as the “typed function” layer and compose it with heavier frameworks.
Token cost is easy to underestimate because every decorated call round-trips to the model. Cache aggressively in production, and prefer classify over @ai_fn when the output space is small and known — it’s cheaper and more deterministic.
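One way to cache is to memoize a thin wrapper so repeated inputs never re-hit the model. A minimal sketch using functools.lru_cache; llm_classify here is a hypothetical stand-in for marvin.classify (a real implementation would round-trip to the backend), with a call counter to show the cache working:

```python
from functools import lru_cache

# Call counter so we can observe cache hits vs. real backend calls.
CALLS = {"count": 0}

def llm_classify(text: str, labels: tuple[str, ...]) -> str:
    """Hypothetical stand-in for marvin.classify; every call would cost tokens."""
    CALLS["count"] += 1
    return labels[0]  # placeholder result

@lru_cache(maxsize=4096)
def classify_cached(text: str, labels: tuple[str, ...]) -> str:
    # lru_cache needs hashable arguments, so labels is a tuple, not a list.
    return llm_classify(text, labels)

classify_cached("This is amazing!", ("positive", "negative"))
classify_cached("This is amazing!", ("positive", "negative"))  # served from cache
print(CALLS["count"])  # → 1: the backing call ran only once
```

For production, swap lru_cache for a persistent store (Redis, a database table) keyed on the input and label set, so the cache survives restarts and is shared across workers.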
Who it’s for
Python developers who want LLM features inside existing services without adopting a full agent framework. If you already live in FastAPI, Prefect, or a data pipeline and you need structured outputs today, Marvin slots in cleanly.