When to Use Local Models
The threshold for considering local models is lower than most people assume.
If you are:
- Processing documents that contain client names, contact details, or project specifics
- Working with any data covered by GDPR, HIPAA, or sector-specific regulations
- Handling information your organisation would be uncomfortable seeing in a data breach report
- Using AI to work with proprietary methods, pricing data, or competitive intelligence
Then a local model deserves serious consideration for that specific use case. Tools like Ollama running models such as Mistral or Llama perform adequately on a modern laptop. They are not as capable as frontier models like GPT-4 or Claude 3.5, but for many tasks — summarisation, drafting, answering questions about documents you provide — the capability gap does not matter for the job at hand.
The practical workflow is: use frontier models (Claude, GPT-4, Gemini) for tasks involving no sensitive data or already public information, and use local models for tasks where data sensitivity is a concern. Both can run simultaneously, and the choice becomes habitual quickly.
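The routing habit described above can be sketched as a small dispatcher. Everything here is illustrative: the regex patterns, the `looks_sensitive` and `choose_model` helpers, and the model labels are assumptions for the sake of the sketch, not part of Ollama or any real tool — a production setup would use a proper PII-detection step rather than a handful of patterns.

```python
import re

# Illustrative sensitivity checks only (assumption, not a real product's logic):
# a real deployment would use a dedicated PII/classification step.
SENSITIVE_PATTERNS = [
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),              # email addresses
    re.compile(r"(?:\+?\d[ -]?){8,}\d"),                     # phone-like digit runs
    re.compile(r"\b(confidential|client|pricing)\b", re.I),  # keyword heuristics
]

def looks_sensitive(text: str) -> bool:
    """Return True if the text matches any sensitive-data pattern."""
    return any(p.search(text) for p in SENSITIVE_PATTERNS)

def choose_model(text: str) -> str:
    """Route sensitive text to a local model, everything else to a hosted one.

    The labels are placeholders: 'mistral (local)' stands for a model run
    via Ollama, 'frontier (hosted)' for whichever cloud model you use.
    """
    return "mistral (local)" if looks_sensitive(text) else "frontier (hosted)"

print(choose_model("Summarise this public press release."))   # hosted is fine
print(choose_model("Draft a reply to jane.doe@client.com."))  # keep it local
```

The point is not the regexes, which are deliberately crude, but the shape of the habit: a cheap check at the start of each task that defaults sensitive material to the local model.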
Module 4 is the practical one: specific things to do, check, and establish based on everything covered so far.