Metrics Explained
Understanding your brand's AI performance metrics is crucial for making data-driven decisions. This comprehensive guide explains every metric in your RankLLM dashboard.
Core Performance Metrics
Mention Rate
The percentage of relevant queries where your brand is mentioned by AI systems.
How It's Calculated: (Brand Mentions ÷ Total Relevant Queries) × 100
Good Range: 15-40%
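For reference, here is a minimal sketch of that formula in Python; the function name and sample counts are made up for illustration and aren't part of RankLLM.

```python
def mention_rate(brand_mentions: int, total_relevant_queries: int) -> float:
    """(Brand Mentions ÷ Total Relevant Queries) × 100."""
    if total_relevant_queries == 0:
        return 0.0
    return brand_mentions / total_relevant_queries * 100

# Hypothetical example: mentioned in 42 of 180 tracked queries -> ~23.3%,
# which falls inside the 15-40% good range.
print(round(mention_rate(42, 180), 1))  # 23.3
```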
Sentiment Score
Average sentiment of mentions across all AI platforms, scored from 0-10.
How It's Calculated: Weighted average of positive, neutral, and negative mentions
Good Range: 7.0-9.0
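The exact weighting isn't published, so the sketch below is only one plausible reading of "weighted average of positive, neutral, and negative mentions": positive counts as 10, neutral as 5, and negative as 0. The weights and sample counts are assumptions, not RankLLM's formula.

```python
def sentiment_score(positive: int, neutral: int, negative: int) -> float:
    # Assumed weights for illustration: positive = 10, neutral = 5, negative = 0.
    total = positive + neutral + negative
    if total == 0:
        return 0.0
    return (positive * 10 + neutral * 5) / total

# Hypothetical example: 30 positive, 12 neutral, 3 negative mentions -> 8.0
print(round(sentiment_score(30, 12, 3), 1))
```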
Position Average
Average position where your brand appears in AI response lists.
How It's Calculated: Sum of all positions ÷ number of mentions
Good Range: 1.0-3.0
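This one is a straight arithmetic mean; a minimal sketch follows, with hypothetical sample positions.

```python
def position_average(positions: list[int]) -> float:
    """Sum of all positions ÷ number of mentions."""
    if not positions:
        return 0.0
    return sum(positions) / len(positions)

# Hypothetical example: the brand appeared 1st, 2nd, 4th, and 1st -> 2.0,
# inside the 1.0-3.0 good range.
print(position_average([1, 2, 4, 1]))
```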
Visibility Score
Overall AI visibility score combining mention rate, sentiment, and position.
How It's Calculated: Proprietary algorithm weighing multiple factors
Good Range: 70-100
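Because the algorithm is proprietary, the sketch below is purely illustrative: it assumes a weighted blend of the three core metrics normalized to a 0-100 scale, with normalizations and weights chosen only for the example.

```python
def visibility_score(mention_rate_pct: float, sentiment: float, avg_position: float) -> float:
    # Illustrative normalization and weights only; not RankLLM's actual algorithm.
    mention_part = min(mention_rate_pct / 40, 1.0)        # treat a 40% mention rate as the ceiling
    sentiment_part = sentiment / 10                       # 0-10 scale -> 0-1
    position_part = max(0.0, (10 - avg_position) / 9)     # position 1 -> 1.0, position 10+ -> 0.0
    return 100 * (0.5 * mention_part + 0.3 * sentiment_part + 0.2 * position_part)

# Hypothetical example: 24% mention rate, 7.8 sentiment, average position 2.5 -> ~70
print(round(visibility_score(24, 7.8, 2.5)))
```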
Performance Scoring Ranges
Good ranges at a glance:
- Mention Rate: 15-40%
- Sentiment Score: 7.0-9.0
- Position Average: 1.0-3.0
- Visibility Score: 70-100
Contextual Analytics
Trend Analysis
Performance changes over time periods (daily, weekly, monthly).
Use Case: Identify patterns, measure campaign impact, and spot emerging issues.
Key Insights:
- Growth trajectories
- Seasonal patterns
- Campaign effectiveness
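If you export your metrics, a simple period-over-period comparison already surfaces the trend direction. The sketch below uses hypothetical weekly mention-rate values; the data points and labels aren't from RankLLM.

```python
# Hypothetical weekly mention rates (percent) exported from the dashboard.
weekly_mention_rate = {"2024-W18": 18.2, "2024-W19": 19.5, "2024-W20": 22.1, "2024-W21": 21.4}

weeks = sorted(weekly_mention_rate)
for prev, curr in zip(weeks, weeks[1:]):
    change = weekly_mention_rate[curr] - weekly_mention_rate[prev]
    print(f"{curr}: {weekly_mention_rate[curr]:.1f}% ({change:+.1f} pts vs {prev})")
```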
Platform Breakdown
Performance across different AI platforms (ChatGPT, Claude, Gemini, etc.).
Use Case: Understand platform-specific strengths and weaknesses.
Key Insights:
- Platform preferences
- Optimization opportunities
- Audience differences
Query Context
Types of questions and contexts where your brand appears.
Use Case: Optimize content for high-value use cases.
Key Insights:
- Use case alignment
- Content gaps
- Positioning opportunities
Response Timing
How quickly mentions appear after content changes or events.
Use Case: Measure how quickly content optimizations and PR efforts show up in AI responses.
Key Insights:
- Content freshness impact
- News cycle effects
- Update propagation
How to Interpret Your Data
Compare Relative Performance
Your metrics are most meaningful when compared to competitors in your industry.
Example: A 20% mention rate might be excellent in a niche B2B market but average in consumer tech.
Look for Trends, Not Just Numbers
Direction of change is often more important than absolute values.
Example: A sentiment score improving from 6.5 to 7.2 over 3 months shows positive momentum.
Consider External Factors
News events, product launches, and market changes affect metrics.
Example: A temporary dip in sentiment after a product issue doesn't indicate long-term problems.
Segment Your Analysis
Break down metrics by platform, time period, or query type for deeper insights.
Example: You might have strong performance on ChatGPT but opportunities on newer platforms.
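As a rough illustration of segmenting by platform, the sketch below computes a per-platform mention rate from exported query results; the records and field names are hypothetical, not RankLLM's export schema.

```python
from collections import defaultdict

# Hypothetical exported query results; field names are assumptions for the example.
results = [
    {"platform": "ChatGPT", "mentioned": True},
    {"platform": "ChatGPT", "mentioned": False},
    {"platform": "Claude",  "mentioned": True},
    {"platform": "Gemini",  "mentioned": False},
]

counts = defaultdict(lambda: [0, 0])  # platform -> [mentions, total queries]
for row in results:
    counts[row["platform"]][1] += 1
    counts[row["platform"]][0] += row["mentioned"]

for platform, (hits, total) in sorted(counts.items()):
    print(f"{platform}: {hits / total:.0%} mention rate across {total} queries")
```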
Metrics-Driven Action Framework
Use this framework to turn your metrics into actionable improvements:
📊 Analyze
- Review all core metrics monthly
- Compare to previous periods
- Identify biggest gaps vs. competitors
🎯 Prioritize
- Focus on lowest-performing metrics first
- Consider business impact of improvements
- Set realistic improvement targets
🚀 Optimize
- Implement content improvements
- Adjust monitoring strategies
- Test different approaches
📈 Measure
- Track changes after optimizations
- Document what works
- Iterate based on results