NPS

Master this essential documentation concept

Quick Definition

Net Promoter Score: a customer loyalty metric that measures how likely customers are to recommend a product or service to others, typically on a 0-10 scale

How NPS Works

```mermaid
flowchart TD
    A[User Completes Documentation Task] --> B[NPS Survey Triggered]
    B --> C["Rate likelihood to recommend (0-10)"]
    C --> D{Score Classification}
    D -->|0-6| E[Detractors]
    D -->|7-8| F[Passives]
    D -->|9-10| G[Promoters]
    E --> H[Follow-up: What frustrated you?]
    F --> I[Follow-up: What would improve this?]
    G --> J[Follow-up: What worked well?]
    H --> K[Content Improvement Backlog]
    I --> K
    J --> L[Best Practice Documentation]
    K --> M[Prioritized Updates]
    L --> N[Template Creation]
    M --> O[Re-measure NPS]
    N --> O
```

Understanding NPS

Net Promoter Score (NPS) is a widely adopted metric that helps documentation teams understand user satisfaction and loyalty by asking one simple question: "How likely are you to recommend this documentation to a colleague?" Users respond on a scale of 0-10, with scores categorized into Detractors (0-6), Passives (7-8), and Promoters (9-10).
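The classification and scoring rules above can be sketched in a few lines of Python. This is a minimal illustration of the standard NPS arithmetic, not any particular survey tool's API:

```python
def classify(score: int) -> str:
    """Map a 0-10 rating to its NPS segment."""
    if not 0 <= score <= 10:
        raise ValueError(f"score must be 0-10, got {score}")
    if score <= 6:
        return "detractor"
    if score <= 8:
        return "passive"
    return "promoter"

def nps(scores: list[int]) -> float:
    """NPS = % promoters - % detractors, ranging from -100 to +100."""
    segments = [classify(s) for s in scores]
    promoters = segments.count("promoter")
    detractors = segments.count("detractor")
    return 100 * (promoters - detractors) / len(scores)

# Example: 5 promoters, 3 passives, 2 detractors -> 50% - 20% = +30
print(nps([9, 10, 9, 10, 9, 7, 8, 7, 3, 5]))  # 30.0
```

Note that passives count toward the denominator but not the numerator, which is why a pile of 7s and 8s still drags the score toward zero.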

Key Features

  • Simple single-question survey format that reduces user friction
  • Standardized 0-10 scale for consistent measurement across teams
  • Automatic categorization of responses into three user segments
  • Benchmarkable score calculation (% Promoters - % Detractors)
  • Optional follow-up questions for qualitative insights

Benefits for Documentation Teams

  • Quantifies user satisfaction with specific documentation sections
  • Identifies high-performing content that can be replicated
  • Reveals pain points requiring immediate attention
  • Provides data-driven justification for documentation investments
  • Tracks improvement trends over time
  • Enables comparison between different documentation formats or topics

Common Misconceptions

  • NPS alone provides complete user satisfaction insights (needs qualitative data)
  • All documentation should achieve the same NPS targets
  • Higher scores always indicate better documentation quality
  • NPS surveys should be deployed on every page or interaction

Turning NPS Feedback Videos into Actionable Documentation

When collecting Net Promoter Score (NPS) feedback, your team likely conducts video interviews or records customer feedback sessions where users explain their ratings. These videos contain invaluable insights about why promoters love your product and why detractors struggle with it.

However, video-only NPS feedback creates significant challenges. Key insights remain trapped in hours of recordings, making it difficult to identify patterns, share findings with product teams, or track improvements over time. When a product manager asks, "What are the top three reasons for low NPS scores?", finding those answers in scattered video files becomes a time-consuming process.

Converting NPS feedback videos into searchable documentation solves this problem. By transforming customer interviews into structured text, you can quickly categorize feedback by score range, identify common pain points from detractors (0-6), understand what satisfies passives (7-8), and learn what delights promoters (9-10). This documentation becomes a searchable knowledge base that helps product teams prioritize improvements that will directly impact your NPS metrics.

Real-World Documentation Use Cases

API Documentation User Journey Optimization

Problem

Developers struggle with API documentation, leading to increased support tickets and delayed integrations

Solution

Deploy NPS surveys at key completion points in API documentation workflows to identify friction points

Implementation

1. Add NPS widgets after code examples, authentication guides, and endpoint references
2. Segment responses by developer experience level and use case
3. Follow up with detractors to understand specific pain points
4. Track NPS trends as documentation improvements are implemented
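Step 2 above, segmenting responses by experience level, might look like the following sketch. The response records and field names are hypothetical; adapt them to whatever your survey tool exports:

```python
from collections import defaultdict

# Hypothetical survey export: each response pairs the 0-10 rating
# with the developer's self-reported experience level.
responses = [
    {"score": 9, "experience": "senior"},
    {"score": 10, "experience": "senior"},
    {"score": 4, "experience": "junior"},
    {"score": 6, "experience": "junior"},
    {"score": 8, "experience": "junior"},
]

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

by_segment = defaultdict(list)
for r in responses:
    by_segment[r["experience"]].append(r["score"])

for level, scores in sorted(by_segment.items()):
    print(f"{level}: NPS {nps(scores):+.0f} (n={len(scores)})")
```

A split like this often reveals that a single blended score hides one audience doing fine while another struggles with the same pages.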

Expected Outcome

Improved developer experience, reduced support burden, and faster API adoption rates

Knowledge Base Content Performance Measurement

Problem

Unable to determine which help articles are truly solving user problems versus just being visited frequently

Solution

Implement NPS scoring on help articles to measure satisfaction beyond page views and time-on-page metrics

Implementation

1. Add NPS surveys to high-traffic help articles
2. Compare NPS scores with traditional metrics like bounce rate
3. Identify articles with high traffic but low NPS for revision
4. Use promoter feedback to understand what makes content exceptional
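Step 3, flagging high-traffic articles with low NPS, is a simple join-and-filter. The article data and thresholds below are illustrative assumptions, not recommended values:

```python
# Hypothetical per-article metrics joining traffic data with survey results.
articles = [
    {"title": "Reset your password", "monthly_views": 12000, "nps": -15},
    {"title": "Install the CLI",     "monthly_views": 8000,  "nps": 42},
    {"title": "Webhook reference",   "monthly_views": 9500,  "nps": 5},
    {"title": "Billing FAQ",         "monthly_views": 700,   "nps": -30},
]

VIEW_THRESHOLD = 5000   # "high traffic" cutoff (assumed)
NPS_THRESHOLD = 20      # below this, flag for revision (assumed)

revision_queue = sorted(
    (a for a in articles
     if a["monthly_views"] >= VIEW_THRESHOLD and a["nps"] < NPS_THRESHOLD),
    key=lambda a: a["monthly_views"],
    reverse=True,  # fix the most-seen weak content first
)

for a in revision_queue:
    print(f'{a["title"]}: {a["monthly_views"]} views, NPS {a["nps"]}')
```

Sorting by views puts the articles with the largest audience, and therefore the largest satisfaction payoff, at the top of the queue.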

Expected Outcome

Data-driven content strategy focusing on user satisfaction rather than just engagement metrics

Onboarding Documentation Effectiveness Tracking

Problem

New users complete onboarding flows but struggle with product adoption, suggesting documentation gaps

Solution

Use NPS to measure satisfaction at each onboarding milestone and identify where users lose confidence

Implementation

1. Deploy micro-NPS surveys after each onboarding section
2. Create cohort analysis comparing NPS scores to long-term user retention
3. A/B test different documentation approaches and measure NPS impact
4. Build feedback loops from low-scoring sections back to content creators
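With per-step micro-NPS in place, finding where users lose confidence reduces to locating the lowest-scoring step. The step names and ratings here are invented for illustration:

```python
# Hypothetical micro-NPS results per onboarding step, in flow order.
step_scores = {
    "Create account":   [9, 10, 9, 8],
    "Connect data":     [9, 8, 7, 9],
    "First dashboard":  [4, 6, 7, 5],   # confidence drops here
    "Invite teammates": [8, 9, 7, 8],
}

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# The step with the lowest score is the likeliest documentation gap.
weakest = min(step_scores, key=lambda step: nps(step_scores[step]))
print(weakest, nps(step_scores[weakest]))
```

In practice you would also weight by sample size per step, since early steps collect far more responses than later ones.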

Expected Outcome

Higher user activation rates and reduced churn during critical early-use periods

Technical Documentation Accessibility Assessment

Problem

Documentation serves users with varying technical expertise, but it's unclear if content complexity matches audience needs

Solution

Segment NPS responses by user role and experience level to optimize content for different audiences

Implementation

1. Include role and experience questions alongside NPS surveys
2. Analyze score variations between beginner, intermediate, and advanced users
3. Identify content that works well for all levels versus specialized content needs
4. Create user-specific documentation paths based on NPS insights
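Step 3 above can be sketched as a per-page "NPS gap" between audience segments: pages where every level scores similarly serve everyone, while a large gap signals content tuned to one audience. The pages, roles, and the 50-point threshold are all assumed for the example:

```python
from collections import defaultdict

# Hypothetical responses: (page, user role, 0-10 rating).
responses = [
    ("getting-started", "beginner", 9), ("getting-started", "advanced", 9),
    ("api-reference",   "beginner", 3), ("api-reference",   "advanced", 10),
    ("api-reference",   "beginner", 5), ("api-reference",   "advanced", 9),
]

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

scores = defaultdict(lambda: defaultdict(list))
for page, role, score in responses:
    scores[page][role].append(score)

flagged = []
for page, by_role in scores.items():
    values = [nps(s) for s in by_role.values()]
    gap = max(values) - min(values)
    if gap > 50:  # assumed threshold: one audience is poorly served
        flagged.append((page, gap))

print(flagged)
```

Here the getting-started page works for both audiences, while the API reference delights advanced users and loses beginners, exactly the kind of page that benefits from a separate beginner path.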

Expected Outcome

Personalized documentation experiences that serve diverse user needs effectively

Best Practices

Time NPS Surveys Strategically

Deploy NPS surveys immediately after users complete meaningful documentation tasks rather than randomly or on page load

✓ Do: Trigger surveys after task completion, successful code implementation, or problem resolution
✗ Don't: Interrupt users mid-task, or survey them immediately upon page arrival before they've engaged with the content

Segment Responses for Actionable Insights

Collect contextual information alongside NPS scores to understand why different user groups provide different ratings

✓ Do: Ask about user role, experience level, specific use case, and documentation section accessed
✗ Don't: Treat all NPS responses equally without understanding the user context behind the score

Follow Up with Qualitative Questions

Use the NPS score as a starting point for deeper conversations about documentation effectiveness

✓ Do: Ask specific follow-up questions tailored to detractors, passives, and promoters for targeted insights
✗ Don't: Rely solely on numerical scores without understanding the reasoning behind user ratings

Create Feedback Loops to Content Teams

Establish clear processes for translating NPS insights into documentation improvements and measuring impact

✓ Do: Create regular review cycles, assign ownership for low-scoring content, and track improvement trends
✗ Don't: Collect NPS data without clear processes for acting on the feedback and closing the loop with users

Benchmark Against Documentation-Specific Standards

Compare NPS scores against other documentation rather than general product NPS benchmarks for realistic expectations

✓ Do: Research industry-specific documentation NPS ranges and set realistic improvement targets
✗ Don't: Expect documentation NPS to match product NPS scores, as the contexts and user expectations differ significantly
