Classification
Module 3 · Section 3 of 6
The most common AI task in a workflow is classification: given this piece of text, which category does it belong to?
Examples:
- Is this contact form submission a sales enquiry, a support request, or spam?
- Is this RSS item relevant to my project or not?
- Is this log message an error, a warning, or informational?
The Basic LLM Chain node handles this well. Connect it after the node that produces the text you want to classify. Set up a system prompt like:
You are a classifier. Given the following message, return exactly one of these labels: sales, support, spam. Return only the label. No explanation.
Pass the text as the user message using an expression:
{{ $json.body.message }}
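For reference, the chain is effectively assembling a system-plus-user message pair from these two pieces. A minimal sketch of that payload (the `buildMessages` helper is hypothetical, not an n8n API):

```javascript
// The classifier prompt from above, as a constant.
const SYSTEM_PROMPT =
  "You are a classifier. Given the following message, return exactly one " +
  "of these labels: sales, support, spam. Return only the label. No explanation.";

// Hypothetical helper: assembles the message pair the chain sends
// for each incoming item. `item` mirrors n8n's $json shape, so the
// text lives at item.body.message, matching the expression above.
function buildMessages(item) {
  return [
    { role: "system", content: SYSTEM_PROMPT },
    { role: "user", content: item.body.message },
  ];
}
```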
The key instruction is “return only the label”. LLMs will explain themselves if you let them. For classification inside a workflow you need a clean output you can route on, not a paragraph about what the model decided.
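Even with that instruction, it is worth normalizing the reply before routing on it: models occasionally add a trailing period, stray whitespace, or capitalization. A small defensive sketch (the `normalizeLabel` helper and the `unknown` fallback are my own additions, not part of n8n):

```javascript
const LABELS = ["sales", "support", "spam"];

// Trim whitespace, lowercase, strip stray punctuation, and fall back
// to "unknown" if the model returned anything outside the allowed set.
function normalizeLabel(raw) {
  const cleaned = raw.trim().toLowerCase().replace(/[."'`]/g, "");
  return LABELS.includes(cleaned) ? cleaned : "unknown";
}
```

Routing the `unknown` case explicitly is safer than letting an unexpected reply fall through the Switch unmatched.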
After the LLM Chain node, add a Switch node. Route based on the output: if the result is sales, go one way. If it’s support, go another. If it’s spam, end the workflow or log it quietly.
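Expressed as code, the Switch is just a branch on that label. A sketch of the routing, with illustrative destination names standing in for the workflow branches:

```javascript
// Mirrors the Switch node's outputs: one destination per label.
// The destination names are illustrative, not n8n identifiers.
function route(label) {
  switch (label) {
    case "sales":
      return "notify-sales-channel";
    case "support":
      return "create-support-ticket";
    case "spam":
      return "log-and-stop"; // end the workflow quietly
    default:
      return "log-and-stop"; // unexpected labels end quietly too
  }
}
```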
Connecting to Claude
To use Claude, create an Anthropic credential in n8n's settings. You'll need an API key from console.anthropic.com. In the LLM Chain node, attach the "Anthropic Chat Model" as the model and select the Claude model you want. Haiku is fast and cheap for classification; use Sonnet for tasks that need more reasoning.
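If you ever need to make the same call directly, outside the LLM Chain node, this is roughly the request body for Anthropic's Messages API (POST to https://api.anthropic.com/v1/messages with `x-api-key` and `anthropic-version` headers). A sketch; the `claude-3-5-haiku-latest` model alias is an assumption and may differ from what your console offers:

```javascript
// Sketch: build a one-shot classification request for the Messages API.
function buildClaudeRequest(text) {
  return {
    model: "claude-3-5-haiku-latest", // Haiku: fast and cheap for classification
    max_tokens: 10,                   // a single label needs very few tokens
    system:
      "You are a classifier. Given the following message, return exactly one " +
      "of these labels: sales, support, spam. Return only the label. No explanation.",
    messages: [{ role: "user", content: text }],
  };
}
```

Capping `max_tokens` low is a cheap guardrail: even if the model tries to explain itself, the reply gets cut off after the label.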
Connecting to GPT
Create an OpenAI credential. Same pattern: attach the "OpenAI Chat Model" as the model in the node configuration.
Connecting to a local model
If you’re running Ollama locally, n8n has an Ollama integration. Create an Ollama credential pointing to http://localhost:11434. Set the model to whatever you’ve pulled — llama3, mistral, qwen2.5-coder. Local models are slower than hosted APIs but cost nothing per call and keep your data local.
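The same request against a local Ollama instance, as a sketch. Ollama's `/api/chat` endpoint (POST to http://localhost:11434/api/chat) takes an OpenAI-style message list; `stream: false` asks for one complete reply instead of a token stream:

```javascript
// Sketch: classification request for a local Ollama server.
function buildOllamaRequest(text) {
  return {
    model: "llama3", // any model you've pulled: mistral, qwen2.5-coder, ...
    stream: false,   // one complete response instead of a token stream
    messages: [
      {
        role: "system",
        content:
          "You are a classifier. Given the following message, return exactly one " +
          "of these labels: sales, support, spam. Return only the label. No explanation.",
      },
      { role: "user", content: text },
    ],
  };
}
```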
I use local classification for content routing where the data is sensitive. The classification request never leaves the machine.