Chapter 2: The PAST Framework for AI Success
Why Most AI Projects Fail Before They Start
Four elements determine whether an AI project succeeds or fails — and none of them are the technology. Miss any one, and your project will struggle no matter how good the tools are.
The PAST Framework gives you the strategic foundations before you spend a penny on tools:
- Purpose: Why are you implementing AI?
- Audience: Who will use and benefit from AI solutions?
- Scope: What boundaries will guide your implementation?
- Tone: How will AI match your voice and working style?
What makes PAST powerful is its versatility. The same framework that guides enterprise AI strategy also works for:
- Designing effective prompts: Purpose (what outcome?), Audience (who reads this?), Scope (what’s included?), Tone (formal or conversational?)
- Personal workflow optimisation: Purpose (why this automation?), Audience (who benefits?), Scope (which tasks?), Tone (how does this fit my work style?)
- Content creation: Purpose (what message?), Audience (who’s reading?), Scope (what topics?), Tone (brand voice?)
This multi-level application is what makes PAST a thinking framework, not just a corporate strategy checklist.
Purpose: Defining Clear Business Outcomes
The Wrong Question: “How can we use AI?”
The Right Question: “What business outcomes do we need to achieve?”
Most AI implementations start with the technology and work backward to business value. This creates solutions looking for problems.
Start with specific, measurable outcomes. At whatever scale you’re working:
- Revenue Growth: “Increase proposals sent per month from 4 to 8 by using AI to cut first-draft time in half”
- Time Recovery: “Reduce client status update emails from 2 hours/week to 30 minutes using a templated AI workflow”
- Quality: “Improve first-draft acceptance rate from 60% to 80% by building AI prompts around my client’s brief format”
- For organisations: “Reduce customer service response time by 50% while maintaining satisfaction scores”
At the prompt engineering level, define what the output needs to accomplish:
❌ Bad: “Write about AI implementation”
✅ Good: “Create a 300-word executive summary explaining why 95% of AI pilots fail and what our organisation can do differently, emphasising ROI concerns”
At the personal productivity level, identify the specific problem:
❌ Bad: “Automate my email”
✅ Good: “Reduce time spent on repetitive client status updates from 2 hours/week to 15 minutes while maintaining personalisation”
Purpose Clarity Exercise:
Complete this statement for each AI initiative: “Success means [specific measurable outcome] which will [business impact] by [timeline] as measured by [metric].”
Example: “Success means reducing invoice processing time from 48 hours to 4 hours, which will free up half a working day per week for client work by the end of Q2, as measured by average processing time.”
The Metric Mandate
If you can’t put a number on your AI goal, you don’t have a goal — you have a wish.
Projects without clear metrics can’t fail, because there’s no definition of failure. They drift, consuming money while delivering “lessons learned” instead of results.
Before any AI project leaves the Purpose phase:
| Requirement | Must Answer |
|---|---|
| Named business metric | What specific number will change? |
| Current baseline | What is that number today? |
| Success threshold | What’s the minimum improvement to justify investment? |
| Measurement timeline | When will we measure? |
| Kill criteria | What result means we stop? |
If any field is blank or “TBD”, the project can’t answer all five questions and isn’t ready for Audience analysis. Return to Purpose.
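The five-requirement gate above can be sketched as a simple readiness check. This is a minimal illustration, not a prescribed implementation; the field names and the example values are hypothetical.

```python
from dataclasses import dataclass, fields
from typing import Optional, Union

@dataclass
class PurposeCheck:
    """The five Purpose-phase requirements from the table above.
    Field names are illustrative placeholders."""
    metric: Optional[str] = None                        # what specific number will change?
    baseline: Optional[Union[int, float]] = None        # what is that number today?
    success_threshold: Optional[Union[int, float]] = None  # minimum improvement to justify investment
    measurement_date: Optional[str] = None              # when will we measure?
    kill_criteria: Optional[str] = None                 # what result means we stop?

    def ready_for_audience(self) -> bool:
        """A project is ready only when no field is blank or 'TBD'."""
        for f in fields(self):
            value = getattr(self, f.name)
            if value is None:
                return False
            if isinstance(value, str) and value.strip().upper() in ("", "TBD"):
                return False
        return True
```

Using the invoice-processing example from the Purpose Clarity Exercise, a filled-in check passes, while any “TBD” entry sends the project back to Purpose.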
Audience: Understanding Who AI Serves
AI implementations fail when they’re designed for technology buyers instead of actual users.
At the organisational level, map your AI stakeholders:
- Primary Users: Who will interact with AI tools daily?
- Secondary Users: Who will receive AI outputs or insights?
- Decision Makers: Who controls budgets and strategic direction?
- IT/Security: Who ensures compliance and integration? (If that’s you: great — one fewer meeting.)
For each audience, understand:
- Current Workflow: How do they work now?
- Pain Points: What frustrates them most?
- Success Metrics: How do they measure effectiveness?
- Change Readiness: How comfortable are they with new tools?
- Required Training: What support will they need?
At the prompt engineering level, consider who reads the output:
- Technical team reading code documentation needs precision and detail
- Executives reading strategy summaries need conciseness and business impact framing
- Customers reading support responses need empathy and clarity without jargon
- Internal teams reading process instructions need step-by-step specificity
At the personal productivity level, think about output consumers:
- Email to clients requires professional tone and context sensitivity
- Internal notes can be more casual and abbreviated
- Documentation for team needs to work for different skill levels
- Your own reference materials can use personal shorthand
Audience Alignment Questions:
- Does this AI solution make their job easier or harder?
- Will they see immediate value or only long-term benefits?
- How will this change their daily routines?
- What resistance should we expect and plan for?
- How will we measure adoption and satisfaction?
Scope: Setting Realistic Boundaries
AI scope creep kills more projects than technical failures.
Every AI implementation needs clear boundaries around:
- Functional Scope: Which business processes are included?
- Data Scope: What information will AI systems access?
- User Scope: Who can use these tools and for what purposes?
- Decision Scope: What decisions can AI make autonomously vs. require human approval?
- Timeline Scope: What’s the implementation roadmap and key milestones?
At the organisational level, start narrow and expand systematically:
- Phase 1: Single use case, limited user group, specific data set
- Phase 2: Adjacent use cases, broader user group, expanded data access
- Phase 3: Cross-functional integration, organisation-wide access, strategic decision support
At the prompt engineering level, define constraints:
- Word count limits (300-word summary vs. comprehensive analysis)
- Information sources (internal data only vs. web research allowed)
- Output format (bullet points, paragraph form, structured JSON)
- Level of detail (executive overview vs. technical deep-dive)
- Creative freedom (strict adherence to facts vs. interpretive analysis)
At the personal productivity level, set boundaries:
- Which tasks get automated vs. remain manual
- How much time you’ll invest in setup vs. ongoing maintenance
- What complexity level you’re comfortable managing
- Where human judgment remains essential vs. where algorithmic decisions are acceptable
Scope Definition Template:
In Scope:
- Specific processes: [List exact workflows]
- User groups: [Define exactly who can access]
- Data sources: [Specify what information AI can use]
- Decision authority: [Clarify what AI can decide vs. recommend]
Out of Scope:
- Excluded processes: [What you’re NOT automating]
- Restricted users: [Who cannot access AI tools]
- Protected data: [Information AI cannot access]
- Human-only decisions: [What requires human judgment]
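The template above translates naturally into a small config with a default-deny rule: anything not explicitly in scope stays out. This is a sketch only; every entry in the dict is a hypothetical example, not a recommendation from the chapter.

```python
# Scope Definition Template as a config dict.
# All entries below are hypothetical placeholders.
SCOPE = {
    "in_scope": {
        "processes": ["research synthesis", "proposal section drafting"],
        "user_groups": ["senior consultants"],
        "data_sources": ["internal knowledge base"],
        "decision_authority": "recommend only",
    },
    "out_of_scope": {
        "processes": ["pricing decisions", "final proposal approval"],
        "restricted_users": ["contractors"],
        "protected_data": ["client financial records"],
        "human_only_decisions": ["client relationship management"],
    },
}

def is_in_scope(process: str, scope: dict = SCOPE) -> bool:
    """Reject anything explicitly out of scope, and accept only what is
    explicitly in scope -- unlisted processes default to out."""
    if process in scope["out_of_scope"]["processes"]:
        return False
    return process in scope["in_scope"]["processes"]
```

The default-deny design matters: scope creep usually enters through tasks nobody listed anywhere, so the safe answer for an unlisted process is “not yet”.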
The Capability Trap Connection:
Remember: BCG found that AI leaders pursue half as many opportunities but scale more than twice as many successfully. Tight scope isn’t limitation — it’s strategic focus that enables deep implementation.
Tone: Making AI Sound Like You
AI implementations succeed when the output sounds like something you’d actually say. They fail when every email, proposal, and social post reads like it came from the same generic robot.
But there’s a critical challenge: AI-generated content can sound generic, indistinguishable from everyone else’s AI output. This “AI slop” problem undermines trust, damages your reputation, and signals lazy thinking to your clients.
Know your own style first:
How risk-tolerant are you? If you prefer to test things quietly before going public, build that into your AI workflow — draft internally first, review carefully, then publish. If you move fast and iterate, your AI setup can match that pace.
How do you communicate? Formal proposals and client reports need AI that matches a professional register. Casual emails and social posts need AI that sounds like you on a good day, not a corporate brochure.
How quickly do you adopt new tools? Be honest with yourself. If you’re the type who needs to sit with something for a week before committing, don’t force yourself into a “try five tools this weekend” approach.
Avoiding AI Slop in Your Work
Now that AI-generated content is everywhere, maintaining a distinct voice is a real competitive advantage. Here’s how to ensure your outputs don’t sound like everyone else’s.
The AI Slop Detection Framework:
Generic Superlatives = Red Flag
AI loves phrases like “impressive company” or “industry-leading solutions” because they’re universally applicable. Your real voice mentions specific things: “your Q3 product launch addressing mid-market banking compliance” instead of “your innovative solutions.”
If your AI-generated content could be sent to any business in your industry with minimal changes, you’re producing AI slop.
The Perfect Grammar Paradox
AI produces flawless writing with zero original insights, while humans make occasional typos but demonstrate real understanding. Don’t chase grammatical perfection — chase authentic insight and industry-specific knowledge.
The Flattery Sandwich Pattern
AI slop follows this pattern religiously: compliment, generic pitch, another compliment.
Example: “Love what you’re building. We help companies scale through proven strategies. Would love to discuss how we can support your impressive work.”
Authentic voice follows: specific observation, relevant solution, clear next step.
Example: “I noticed your Q3 implementation reduced customer support response time by 40% while maintaining NPS scores. Based on that systematic approach, our escalation workflow automation could eliminate the remaining manual handoffs in your tier-2 support process.”
Maintaining Your Authentic Voice
Specificity Over Polish: Value your industry knowledge and specific details over perfect grammar and generic compliments. “I helped a client reduce regulatory reporting time from 80 hours to 12 hours using automated data validation” beats “I help companies achieve remarkable efficiency improvements.”
Challenge Generic Outputs: When reviewing AI-generated content, ask: “Could this exact text have been written by anyone in my industry?” If yes, rewrite with specifics. If no, you’ve maintained differentiation.
Industry Knowledge Integration: AI should enhance your domain expertise, not replace it. Generic AI output + your specific industry knowledge = your authentic voice. The AI provides structure and efficiency; you provide the insights that matter.
Verification Protocols: Build a quick review step into your process to catch AI slop before it goes out:
- Does this mention specific client outcomes, or generic success metrics?
- Does this demonstrate actual industry knowledge, or template insights?
- Could a competitor send essentially the same message?
- Does this reflect what you can actually do, or universal claims?
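The first verification question lends itself to a crude automated pass: scan a draft for the generic superlatives flagged earlier in this section. A sketch, with a hypothetical phrase list you would extend from your own drafts; an empty result clears only this one heuristic, not the whole protocol.

```python
# Hypothetical starter list -- extend with the generic phrases
# you catch most often in your own AI-generated drafts.
GENERIC_PHRASES = [
    "industry-leading",
    "impressive company",
    "innovative solutions",
    "proven strategies",
    "love what you're building",
]

def slop_flags(text: str) -> list:
    """Return the generic phrases found in a draft.
    An empty list does not prove authenticity; it only means
    this particular red flag was not triggered."""
    lowered = text.lower()
    return [phrase for phrase in GENERIC_PHRASES if phrase in lowered]
```

A draft like “We offer industry-leading innovative solutions” gets flagged twice, while the specific observation from the authentic-voice example earlier (“your Q3 implementation reduced customer support response time by 40%”) passes clean.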
Tone Setting Questions
- How do you personally adopt new tools — slowly and carefully, or fast and iterative?
- How formal or informal does your work typically feel?
- What level of autonomy are you comfortable giving automated processes?
- How should AI-generated content handle errors or uncertainty?
- What knowledge should always come from you, not an algorithm?
- Where does your reputation depend on sounding like yourself, not a template?
PAST Framework Application Example
Company: Mid-size consulting firm implementing AI for research and proposal development
Purpose: Reduce proposal development time by 40% while improving quality and win rates through better research synthesis and content personalisation.
Audience:
- Primary: Senior consultants who write proposals
- Secondary: Partners who review and approve proposals
- Decision Makers: Practice leaders who control budgets
- IT: Must ensure client data security
Scope:
- In: Research synthesis, competitive analysis, proposal section drafting
- Out: Client relationship management, final proposal approval, pricing decisions
- Phase 1: One practice area, 5 senior consultants, standard proposal templates
- Phase 2: All practice areas, proposal presentation support
- Phase 3: Integration with CRM and project management systems
Tone:
- Professional but approachable (matches consulting culture)
- Supports human expertise rather than replacing it
- Maintains high-quality standards expected by clients
- Respects confidentiality and client-specific customisation needs
- Avoids AI slop: Proposals reference specific client challenges, not generic business compliments
- Stands out: The firm’s industry knowledge stays central — AI handles efficiency, not insights
Result: A clear plan that matches how you actually work, delivers results you can track, and keeps your client communication authentic.
The PAST Framework doesn’t tell you which tools to buy. It tells you what you’re actually trying to accomplish — and that’s the question most people never ask before they start spending.