
Analytics Guide

Metrics Explained

Understanding your brand's AI performance metrics is crucial for making data-driven decisions. This comprehensive guide explains every metric in your RankLLM dashboard.

📊 10 min read · 📈 Intermediate level · 🎯 Essential knowledge

Core Performance Metrics

Mention Rate

The percentage of relevant queries where your brand is mentioned by AI systems

How It's Calculated

(Brand Mentions ÷ Total Relevant Queries) × 100

Good Range

15-40%

Interpretation Guide

High: Strong AI visibility - your brand is frequently referenced
Medium: Moderate visibility - room for optimization
Low: Limited visibility - needs improvement
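
If you export your raw query results, the formula above is easy to reproduce. Below is a minimal Python sketch, assuming each query record carries a boolean mentions_brand flag; that field name is an illustrative assumption, not the RankLLM export schema.

```python
def mention_rate(queries: list[dict]) -> float:
    """Percentage of relevant queries whose AI response mentions the brand.

    `queries` is assumed to be a list of records with a boolean
    `mentions_brand` flag -- an illustrative shape, not the RankLLM schema.
    """
    if not queries:
        return 0.0
    mentions = sum(1 for q in queries if q["mentions_brand"])
    return mentions / len(queries) * 100

# Example: 12 brand mentions across 60 relevant queries -> 20.0%
sample = [{"mentions_brand": i < 12} for i in range(60)]
print(mention_rate(sample))  # 20.0
```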

Sentiment Score

Average sentiment of mentions across all AI platforms, scored on a 0-10 scale

How It's Calculated

Weighted average of positive, neutral, and negative mentions

Good Range

7.0-9.0

Interpretation Guide

High: Very positive brand perception in AI responses
Medium: Generally positive with some neutral mentions
Low: Mixed or negative sentiment needs attention
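
RankLLM does not publish the exact weights behind this average, so the sketch below only illustrates the general shape of a weighted sentiment score on the 0-10 scale. The per-label scores (positive = 10, neutral = 5, negative = 0) are assumptions for the example, not the production weighting.

```python
# Illustrative per-label scores on the 0-10 scale -- an assumption,
# not RankLLM's published weighting.
LABEL_SCORES = {"positive": 10.0, "neutral": 5.0, "negative": 0.0}

def sentiment_score(mentions: list[str]) -> float:
    """Average sentiment of labeled mentions, scored 0-10."""
    if not mentions:
        return 0.0
    return sum(LABEL_SCORES[label] for label in mentions) / len(mentions)

# Example: 6 positive, 3 neutral, and 1 negative mention -> 7.5
print(sentiment_score(["positive"] * 6 + ["neutral"] * 3 + ["negative"]))
```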

Position Average

Average position where your brand appears in AI response lists (lower is better)

How It's Calculated

Sum of all positions ÷ number of mentions

Good Range

1.0-3.0

Interpretation Guide

High: Consistently mentioned first - top-of-mind for AI
Medium: Regular mentions but not always prioritized
Low: Often mentioned later in responses
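
Position Average is a plain arithmetic mean, so it is easy to verify against your own records. The sketch below assumes each mention carries the 1-based position at which your brand appeared in the AI's list.

```python
def position_average(positions: list[int]) -> float:
    """Mean 1-based position of the brand across AI response lists.

    Lower is better: 1.0 means the brand always appears first.
    """
    if not positions:
        raise ValueError("no mentions to average")
    return sum(positions) / len(positions)

# Example: mentioned 1st, 2nd, 1st, and 4th across four responses -> 2.0
print(position_average([1, 2, 1, 4]))  # 2.0
```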

Visibility Score

Overall AI visibility score combining mention rate, sentiment, and position

How It's Calculated

Proprietary algorithm that weights multiple factors

Good Range

70-100

Interpretation Guide

High: Excellent overall AI presence
Medium: Good visibility with optimization opportunities
Low: Significant improvement needed
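
The Visibility Score algorithm itself is proprietary and is not reproduced here. Purely to illustrate how inputs on different scales can roll up into a single 0-100 number, the sketch below blends the three core metrics with assumed normalization targets and 40/30/30 weights; none of these values reflect RankLLM's actual formula.

```python
def illustrative_visibility(mention_rate: float, sentiment: float,
                            position_avg: float) -> float:
    """Illustrative 0-100 roll-up of the three core metrics.

    The normalization targets and 40/30/30 weights are assumptions for
    demonstration only; RankLLM's real algorithm is proprietary.
    """
    mention_part = min(mention_rate / 40.0, 1.0)            # 40% mention rate caps this component
    sentiment_part = sentiment / 10.0                        # sentiment already lives on 0-10
    position_part = max(0.0, (10.0 - position_avg) / 9.0)    # position 1 -> 1.0, position 10+ -> 0.0
    return 100 * (0.4 * mention_part + 0.3 * sentiment_part + 0.3 * position_part)

# Example: 25% mention rate, sentiment 7.8, average position 2.3
print(round(illustrative_visibility(25, 7.8, 2.3), 1))
```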

Performance Scoring Ranges

Mention Rate

  • 30%+: Industry-leading visibility
  • 15-30%: Strong market presence
  • 5-15%: Moderate visibility
  • <5%: Limited recognition

Sentiment Score

  • 8.5+: Overwhelmingly positive
  • 7.0-8.5: Generally positive
  • 5.0-7.0: Mixed sentiment
  • <5.0: Negative perception

Position Average

  • 1.0-1.5: Top-of-mind leader
  • 1.5-3.0: Frequently prioritized
  • 3.0-5.0: Moderate positioning
  • 5.0+: Late mentions

Visibility Score

  • 85+: Exceptional presence
  • 70-85: Strong visibility
  • 50-70: Room for improvement
  • <50: Needs attention
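
If you automate reporting on exported metrics, the ranges above translate directly into a banding function. The sketch below encodes the Mention Rate bands as an example; the other metrics follow the same pattern with their own thresholds.

```python
def mention_rate_band(rate: float) -> str:
    """Map a mention rate (percent) to the band labels listed above."""
    if rate >= 30:
        return "Industry-leading visibility"
    if rate >= 15:
        return "Strong market presence"
    if rate >= 5:
        return "Moderate visibility"
    return "Limited recognition"

print(mention_rate_band(22))  # Strong market presence
```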

Contextual Analytics

Trend Analysis

Performance changes over time periods (daily, weekly, monthly)

Use Case:

Identify patterns, measure campaign impact, spot emerging issues

Key Insights:

  • Growth trajectories
  • Seasonal patterns
  • Campaign effectiveness
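
A simple way to start trend analysis on exported data is a period-over-period delta on any core metric. The sketch below compares consecutive weekly mention rates; the input values are illustrative.

```python
def period_over_period(values: list[float]) -> list[float]:
    """Change between consecutive periods (e.g. weekly mention rates)."""
    return [round(curr - prev, 2) for prev, curr in zip(values, values[1:])]

# Example: four weekly mention rates trending upward
weekly_mention_rate = [14.0, 15.5, 17.2, 18.9]
print(period_over_period(weekly_mention_rate))  # [1.5, 1.7, 1.7]
```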

Platform Breakdown

Performance across different AI platforms (ChatGPT, Claude, Gemini, etc.)

Use Case:

Understand platform-specific strengths and weaknesses

Key Insights:

  • Platform preferences
  • Optimization opportunities
  • Audience differences
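
A platform breakdown is the same metric computed per platform. The sketch below counts brand mentions per platform from an assumed list of mention records; the platform and mentions_brand fields are illustrative, not the RankLLM export format.

```python
from collections import Counter

def mentions_by_platform(records: list[dict]) -> Counter:
    """Count brand mentions per AI platform (assumed `platform` field)."""
    return Counter(r["platform"] for r in records if r.get("mentions_brand"))

sample = [
    {"platform": "ChatGPT", "mentions_brand": True},
    {"platform": "ChatGPT", "mentions_brand": True},
    {"platform": "Claude", "mentions_brand": True},
    {"platform": "Gemini", "mentions_brand": False},
]
print(mentions_by_platform(sample))  # Counter({'ChatGPT': 2, 'Claude': 1})
```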

Query Context

Types of questions and contexts where your brand appears

Use Case:

Optimize content for high-value use cases

Key Insights:

  • Use case alignment
  • Content gaps
  • Positioning opportunities

Response Timing

How quickly mentions appear after content changes or events

Use Case:

Measure content optimization and PR impact speed

Key Insights:

  • Content freshness impact
  • News cycle effects
  • Update propagation

How to Interpret Your Data

Compare Relative Performance

Your metrics are most meaningful when compared to competitors in your industry

Example: A 20% mention rate might be excellent in a niche B2B market but average in consumer tech

Look for Trends, Not Just Numbers

Direction of change is often more important than absolute values

Example: A sentiment score improving from 6.5 to 7.2 over 3 months shows positive momentum

Consider External Factors

News events, product launches, and market changes affect metrics

Example: A temporary dip in sentiment after a product issue doesn't indicate long-term problems

Segment Your Analysis

Break down metrics by platform, time period, or query type for deeper insights

Example: You might have strong performance on ChatGPT but opportunities on newer platforms

Metrics-Driven Action Framework

Use this framework to turn your metrics into actionable improvements:

📊 Analyze

  • Review all core metrics monthly
  • Compare to previous periods
  • Identify biggest gaps vs competitors

🎯 Prioritize

  • Focus on lowest-performing metrics first
  • Consider business impact of improvements
  • Set realistic improvement targets

🚀 Optimize

  • Implement content improvements
  • Adjust monitoring strategies
  • Test different approaches

📈 Measure

  • Track changes after optimizations
  • Document what works
  • Iterate based on results