ICP Validation Plan: 2-Week Sprint
Date: October 20, 2025
Goal: Validate AI-assisted developers as the primary ICP through 5-10 external interviews
Timeline: 2 weeks (Oct 20 - Nov 3, 2025)
Success Criteria: 60%+ willing to use, 40%+ willing to pay $20+, 30%+ want early access
Why This Validation is Critical
Current status:
- ✅ Founder (Oleg) validated as ICP (personal pain confirmed)
- ⏳ Need 5-10 external confirmations (ensure not a "snowflake")
- ⏳ Test messaging, pricing, feature priorities
Risk if skipped:
- Build for imaginary users (not actual market)
- Wrong feature prioritization (waste 2-3 months)
- Pricing mismatch (too high or too low)
- Wrong channels (can't reach customers)
Investment: 25-38 hours over 2 weeks (roughly 13-19 hours per week; see Time Investment Summary)
Payoff: 3-6 months saved on a wrong direction
Validation Objectives
Primary Objectives (MUST VALIDATE)
- Pain confirmation: Other AI developers struggle with same image workflow bottleneck
- Willingness to pay: Ready to spend $10-50/month for solution
- Feature priority: MCP integration is desired (not just nice-to-have)
- Usage frequency: Build projects regularly enough to justify subscription/credits
Secondary Objectives (BONUS DATA)
- Messaging resonance: Which phrases/benefits resonate most
- Channel discovery: Where did validated users come from
- Pricing preference: Credits vs. subscription preference
- Competitor awareness: What alternatives they're currently using
Week 1: Outreach Setup + First Interviews
Day 1-2 (Mon-Tue, Oct 20-21): Prep & Launch
Tasks:
Setup Anonymous Accounts
- Create throwaway Reddit account (e.g., ai_dev_researcher, claude_code_user)
- Create anonymous Indie Hackers account
- Join Discord servers:
- Claude AI official Discord
- Cursor Discord
- AI Tinkerers
- Next.js Discord
Time: 1-2 hours
Write Outreach Posts (3 variants)
Variant A: Problem-focused
Title: "Claude Code users: How do you handle images?"
Using Claude Code to build NextJS sites has been great, but images are still a pain. I end up:
1. Leaving Claude Code
2. Opening Gemini Studio/Midjourney
3. Generating manually
4. Downloading, organizing, importing
Takes longer than building the actual site sometimes.
How do you handle this? Any workflows that work well?
(Exploring this problem space, not selling anything yet)
Variant B: Solution validation
Title: "Would you use AI image generation directly from Claude Code?"
I've been using Claude Code for projects and love it, but image generation is a workflow killer.
Thinking about a tool that:
- Generates images via MCP (without leaving Claude Code)
- Returns production CDN URLs automatically
- Maintains consistency (logos, characters, etc.)
Would this solve a real problem for you? What's missing?
Variant C: Workflow comparison
Title: "AI-assisted devs: What's your image generation workflow?"
Quick survey for developers using AI coding tools (Claude Code, Cursor, etc.):
1. How do you generate images for your projects?
2. Where does it fit in your development workflow?
3. What's the most annoying part?
4. Would you pay to automate it?
Researching this space, want to understand actual pain points.
Decision: Test Variant A first (lowest friction), then B if low response
Time: 30 minutes
Post in Target Communities
Priority 1 (Post immediately):
- r/ClaudeAI (use Variant A)
- r/ChatGPTCoding (use Variant A or C)
Priority 2 (Post Day 2):
- r/nextjs (use Variant C - less AI-specific)
- Indie Hackers "Ask IH" (use Variant B)
Priority 3 (If needed):
- r/webdev (large but noisy)
- r/cursor (smaller community)
Time: 30 minutes (posting) + monitoring responses
Day 3-4 (Wed-Thu, Oct 22-23): First Interviews
Tasks:
Respond to Comments & DM Respondents
DM Template:
Hey! Saw your comment on [post title].
I'm doing deeper research on AI developer workflows and would love to hear more about your experience.
Would you be open to answering 5-7 questions via DM? Should take ~10 minutes.
(Not selling anything, genuinely trying to understand the problem space)
Response rate estimate: 50-60% will agree (text-based is low friction)
Time: 2-3 hours (responding + DMing 10-15 people)
Conduct Text-Based Interviews (Target: 3-5)
Interview Script (via DM):
Thanks for agreeing to chat! Here are the questions:
1. What AI coding tools do you use? (Claude Code, Cursor, other?)
2. How often do you build web projects? (Daily, weekly, monthly?)
3. Do you generate images for your projects? If yes, how?
4. What's the most annoying part of your current image workflow?
5. How much time does image handling take vs. actual coding?
6. Have you tried to automate this? What did you try?
7. If there was an MCP tool that let Claude Code generate + insert production-ready images automatically, would you use it?
8. What would you pay for that? ($0, $10-20, $20-50, $50+/month or one-time credit packs?)
No rush, answer when convenient. Really appreciate your input!
Documentation: After each interview, fill out:
- Pain severity (1-10)
- Willingness to pay (Yes/No/Maybe + amount)
- Key quotes
- Red flags / Green lights
Time: 3-5 hours (depending on response speed)
Day 5-7 (Fri-Sun, Oct 24-26): Additional Outreach + Interviews
Tasks:
Expand Outreach (If <5 interviews completed)
Option A: More Reddit posts
- r/IndieHackers
- r/SideProject (if applicable)
Option B: Discord engagement
- Post in #general or #tools channels (Claude AI Discord)
- Ask in #show-and-tell (if appropriate)
Option C: Direct DMs to active users
- Find recent posters in target subreddits
- DM with interview request
Time: 2-3 hours
Conduct More Interviews (Target: 5-7 total by end of Week 1)
Same script as Day 3-4
Analysis checkpoint:
- After 5 interviews, do initial pattern analysis
- Are 60%+ showing interest? (3 out of 5)
- Are 40%+ willing to pay $20+? (2 out of 5)
- If YES → continue
- If NO → reassess (wrong ICP? wrong messaging?)
Time: 3-5 hours
Week 2: Deep Dive + Decision
Day 8-10 (Mon-Wed, Oct 27-29): Final Interviews
Tasks:
Reach 10-15 Total Interviews
If Week 1 went well (5-7 completed):
- Conduct 3-5 more interviews
- Focus on validating specific insights from Week 1
If Week 1 was slow (<5 completed):
- Double down on outreach
- Try different messaging (Variant B or C)
- Consider voice calls (if willing to work around the English barrier)
Time: 4-6 hours
Pattern Analysis (Midpoint Review)
After ~8-10 interviews, analyze:
Quantitative:
- X% say "I would use this" (target: 60%+)
- X% willing to pay $20+ (target: 40%+)
- X% want early access (target: 30%+)
Qualitative:
- What words do THEY use to describe the problem?
- What features are mentioned most often?
- What objections/concerns come up repeatedly?
- What alternatives do they currently use?
Pricing insights:
- Credits vs. subscription preference?
- Price sensitivity ($10 vs $20 vs $50)?
- Usage patterns (episodic vs. regular)?
Time: 2-3 hours
Day 11-12 (Thu-Fri, Oct 30-31): Final Analysis
Tasks:
Complete Interview Analysis
Create summary document:
Validated Green Lights:
- Consistent pain point confirmed (describe)
- Willingness to pay validated (X% at $Y price)
- MCP integration desired (X% want it)
- Usage frequency sufficient (X% build projects at least monthly)
- Current workarounds inadequate (describe)
Red Flags Identified:
- List any recurring objections
- Note any patterns of disinterest
- Identify segments that DON'T fit
Messaging Insights:
- Best phrases/benefits (what resonated)
- Worst messaging (what confused/turned off)
- Technical details to emphasize
- Pain points to lead with
Channel Insights:
- Which channels yielded best respondents?
- Where are validated users most active?
- Which communities to prioritize for launch?
Time: 3-4 hours
Compile Validation Report
Structure:
1. Executive Summary
- ICP validated? (Yes / Pivot / Stop)
- Confidence level (High / Medium / Low)
- Key insights (3-5 bullets)
2. Interview Results
- Total interviews: X
- Green lights: X% (target: 60%+)
- Willing to pay $20+: X% (target: 40%+)
- Want early access: X% (target: 30%+)
3. Pain Point Confirmation
- Describe the validated pain in THEIR words
- Severity ranking (1-10 average)
- Current workarounds (what they do now)
4. Feature Priorities (from interviews)
- Must-have features (top 3)
- Nice-to-have features
- Features they don't care about
5. Pricing Insights
- Preferred model (credits vs. subscription)
- Price sensitivity analysis
- Budget availability confirmed
6. Messaging Recommendations
- Lead with: [validated pain point]
- Emphasize: [key benefits in their language]
- Avoid: [messaging that didn't resonate]
7. Next Steps
- If validated: Build MVP for this ICP
- If pivot needed: Test different segment
- If stop: Reassess fundamental assumptions
Time: 2-3 hours
Day 13-14 (Weekend, Nov 1-2): Decision + Planning
Tasks:
Go/Pivot/Stop Decision
Decision Matrix:
GO (Build MVP for AI developers):
- ✅ 60%+ willing to use
- ✅ 40%+ willing to pay $20+
- ✅ 30%+ want early access
- ✅ Consistent pain point
- ✅ Clear feature priorities
- ✅ Accessible channels identified
PIVOT (Test different ICP):
- ⚠️ <60% willing to use
- ⚠️ Interest but weak willingness to pay
- ⚠️ Pain exists but not urgent
- ⚠️ Conflicting feature requests
- ⚠️ Budget concerns
STOP (Fundamental rethink):
- 🛑 <40% willing to use
- 🛑 No one willing to pay
- 🛑 Pain is theoretical, not actual
- 🛑 Current solutions adequate
- 🛑 Market doesn't exist
Time: 2-3 hours (discussion with @men)
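To make the matrix above unambiguous for borderline results, here is the decision rule written as a small function. This is a sketch only: the thresholds mirror the matrix, but the exact PIVOT band is an assumption covering the gap the matrix leaves between the GO and STOP conditions, and the input percentages come from the interview tallies (see the sketch under the documentation template below).

```typescript
// Sketch of the Go/Pivot/Stop matrix as a function. Thresholds mirror the
// matrix above; the PIVOT band is an assumption for everything in between.
type Decision = "GO" | "PIVOT" | "STOP";

function decide(wouldUsePct: number, payAtLeast20Pct: number, earlyAccessPct: number): Decision {
  // GO: all three primary targets met.
  if (wouldUsePct >= 60 && payAtLeast20Pct >= 40 && earlyAccessPct >= 30) return "GO";
  // STOP: fewer than 40% would use it, or no one willing to pay
  // (approximated here by the $20+ tally).
  if (wouldUsePct < 40 || payAtLeast20Pct === 0) return "STOP";
  // PIVOT: interest exists, but at least one target was missed.
  return "PIVOT";
}
```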
If GO: Plan MVP Build
Next steps:
- Finalize MVP scope based on validated priorities
- Strip features to validated top 3 must-haves
- Set 4-6 week development timeline
- Plan beta launch to validated channels
- Prepare early access list from interviews
Time: 2-3 hours
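To make the MVP scope above concrete, here is a rough sketch of the core piece the interviews are meant to validate: an MCP server exposing a tool that generates an image and hands Claude Code a production CDN URL. It assumes the MCP TypeScript SDK (@modelcontextprotocol/sdk); generateAndUpload is a hypothetical placeholder for the image-model call and CDN upload, not an existing API.

```typescript
// Minimal sketch of the "generate image without leaving Claude Code" tool.
// Assumes the MCP TypeScript SDK; generateAndUpload() is a hypothetical helper.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical helper: call an image model, upload the result, return a CDN URL.
async function generateAndUpload(prompt: string, style?: string): Promise<string> {
  // ...image API call + CDN upload would go here...
  return "https://cdn.example.com/generated/placeholder.png";
}

const server = new McpServer({ name: "image-gen", version: "0.0.1" });

// Claude Code calls this tool instead of the user switching to another app.
server.tool(
  "generate_image",
  "Generate an image and return a production-ready CDN URL",
  { prompt: z.string(), style: z.string().optional() },
  async ({ prompt, style }) => {
    const url = await generateAndUpload(prompt, style);
    return { content: [{ type: "text", text: url }] };
  }
);

// Expose the server over stdio so Claude Code can launch it locally.
await server.connect(new StdioServerTransport());
```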
If PIVOT: Identify Alternative ICP
Options:
- Test agencies (original hypothesis)
- Test e-commerce (product image use case)
- Test different developer segment (WordPress, etc.)
- Refine AI developer segment (different pain point)
Timeline: Another 1-2 weeks validation
Interview Documentation Template
Use this for EVERY interview:
Interview #: ___
Date: ___
Channel: (Reddit / Indie Hackers / Discord / DM)
Contact: (username / email, for follow-up)
Q1: What AI coding tools do you use? Answer:
Q2: How often do you build web projects? Answer:
Q3: Do you generate images for your projects? How? Answer:
Q4: Most annoying part of current workflow? Answer:
Q5: Time spent on images vs. coding? Answer:
Q6: Tried to automate? What did you try? Answer:
Q7: Would you use MCP tool for image generation? Answer:
Q8: What would you pay? Answer:
ANALYSIS:
Pain Severity: ___/10
Willingness to Pay: [ ] Yes at $__ [ ] No [ ] Maybe
Early Adopter: [ ] High [ ] Medium [ ] Low
Key Quotes:
Green Lights:
Red Flags:
Follow-up Notes:
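If each interview is also logged as a structured record (a sketch below, with illustrative field names mapped loosely to the template above), the quantitative tallies for the pattern analysis and success criteria can be computed mechanically and fed into the Go/Pivot/Stop sketch earlier in this plan.

```typescript
// Sketch: one structured record per interview, mirroring the template above.
// Field names are illustrative, not a fixed schema.
interface InterviewRecord {
  id: number;
  channel: "reddit" | "indiehackers" | "discord" | "dm";
  painSeverity: number;       // 1-10, from the ANALYSIS block
  wouldUse: boolean;          // Q7: would use the MCP tool
  willingToPayUsd: number;    // Q8: 0 if unwilling to pay
  wantsEarlyAccess: boolean;
  keyQuotes: string[];
}

// Tallies for the pattern analysis: % who would use it, % willing to pay $20+,
// % wanting early access. These percentages are the inputs to the decision rule.
function summarize(records: InterviewRecord[]) {
  const pct = (n: number) => (records.length ? Math.round((n / records.length) * 100) : 0);
  return {
    interviews: records.length,
    avgPainSeverity: records.length
      ? records.reduce((sum, r) => sum + r.painSeverity, 0) / records.length
      : 0,
    wouldUsePct: pct(records.filter((r) => r.wouldUse).length),                     // target: 60%+
    willingToPay20Pct: pct(records.filter((r) => r.willingToPayUsd >= 20).length),  // target: 40%+
    earlyAccessPct: pct(records.filter((r) => r.wantsEarlyAccess).length),          // target: 30%+
  };
}
```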
Stealth Outreach Guidelines
To avoid employer discovery:
✅ DO:
- Use throwaway Reddit accounts (no connection to real identity)
- Use anonymous Indie Hackers account
- Use Discord with generic username
- Frame as "research" not "building a product"
- Focus on developer communities (not LinkedIn)
- Keep it casual, peer-to-peer tone
- Avoid mentioning current employer or location
⌠DON'T:
- Post on LinkedIn (colleagues will see)
- Use real name in Reddit/IH posts
- Mention "Thailand", "Frontend Developer", or other identifying details
- Reference current employer's product
- Post during work hours (timezone tells)
- Connect outreach accounts to personal social media
Success Criteria (Validation Complete)
ICP validated if ALL of these:
- 10-15 interviews completed
- 60%+ say "I would use this" (6+ out of 10)
- 40%+ willing to pay $20+ (4+ out of 10)
- 30%+ want early access (3+ out of 10)
- Consistent pain point (not scattered problems)
- Budget confirmed (they pay for other tools)
- Clear feature priorities (MCP + 2-3 others)
Strong bonus signals:
- Multiple people ask "When can I try this?"
- Referrals offered ("You should talk to...")
- Specific feature requests (they've thought deeply)
- Offer to beta test / provide feedback
- Share contact info for updates
Time Investment Summary
Week 1:
- Day 1-2: 2-3 hours (setup + posts)
- Day 3-4: 5-8 hours (DMs + 3-5 interviews)
- Day 5-7: 5-8 hours (more outreach + interviews)
- Total: 12-19 hours
Week 2:
- Day 8-10: 6-9 hours (final interviews + analysis)
- Day 11-12: 5-7 hours (report + decision)
- Day 13-14: 2-3 hours (planning next steps)
- Total: 13-19 hours
Grand Total: 25-38 hours over 2 weeks
This is roughly 13-19 hours/week, which fits within the 15-20 hours/week you have available
Contingency Plans
If Low Response Rate (<5 interviews by end of Week 1)
Options:
1. Try different messaging:
- Use Variant B or C
- More direct value prop
- Offer incentive (early access, discount)
2. Expand channels:
- Discord (more active communities)
- Twitter DMs (if comfortable)
- Dev.to (carefully, your existing account)
3. Voice calls (overcome language barrier):
- Offer 15-min video calls (record if permitted)
- Use real-time translation tools
- Prepare questions in advance
4. Leverage network:
- Ask developer friends
- Former colleagues (trusted)
- Online acquaintances
If Conflicting Feedback
Example: 50% love MCP, 50% don't care about MCP
Action:
- Segment respondents (who loves it vs. who doesn't?)
- Is there a clearer sub-ICP? (e.g., solo devs vs. team devs)
- Which segment has higher willingness to pay?
- Prioritize segment with strongest signals
If Validated Pain But No Willingness to Pay
Red flag: "I'd use it if it was free"
Possible issues:
- Price too high (test $10 tier)
- Value not clear (emphasize ROI)
- Wrong ICP (hobbyists, not professionals)
- Market too immature (wait 6-12 months)
Decision: PIVOT to different segment OR wait for market maturity
Post-Validation: Next Steps
If GO Decision:
Immediate actions (Week 3):
- Create validated MVP scope document (strip to top 3 features)
- Set development timeline (4-6 weeks)
- Prepare beta access list (from interviews)
- Design onboarding flow (MCP setup, first generation)
- Plan soft launch strategy (Reddit post in r/ClaudeAI)
Communication to interviewees:
Subject: Thanks for your input on AI dev workflow research
Hey [Name],
Thanks for taking time to chat about your development workflow a few weeks ago.
Based on feedback from you and others, I'm building a tool to solve the image generation bottleneck for AI-assisted developers.
Would you be interested in early access? I'll send an invite when it's ready (estimated [date]).
No pressure - just wanted to close the loop since you helped validate this.
Cheers,
Oleg
If PIVOT Decision:
Immediate actions (Week 3):
- Select alternative ICP hypothesis
- Prepare new interview script
- Identify channels for alternative ICP
- Run another 5-10 interviews (1-2 weeks)
- Set final deadline (if 2nd ICP fails, stop)
If STOP Decision:
Immediate actions:
- Document learnings (what didn't work)
- Preserve relationships (thank interviewees)
- Consider: Different problem? Different approach?
- Reassess: Continue as side project OR shut down?
- Focus energy back on day job (reduce stress)
Key Reminders
✅ DO:
- Be genuinely curious (not selling)
- Listen more than talk
- Ask "why" follow-up questions
- Document immediately after each interview
- Thank people for their time
⌠DON'T:
- Pitch the product (you're learning, not selling)
- Lead the witness ("Wouldn't you love X?")
- Ignore red flags (confirmation bias)
- Skip documentation (memory fades fast)
- Give up after 3-4 interviews (need 10+ for patterns)
Document owner: @men
Timeline: Oct 20 - Nov 3, 2025 (2 weeks)
Next milestone: Go/Pivot/Stop decision by Nov 3
Related docs: 07-validated-icp-ai-developers.md, 09-mvp-scope.md