
Analytics & Trends

Track your brand's AI visibility over time with charts, comparisons, and performance breakdowns.

Overview

The Analytics and Trends pages transform your snapshot data into visual insights. While individual snapshots tell you what happened in a single moment, Analytics and Trends show you how your AI visibility is changing over days, weeks, and months.

AEO Optima provides two complementary views: the Analytics page for detailed breakdowns and the Trends page for high-level performance tracking.

Analytics Page

The Analytics page provides three primary visualizations that help you understand the shape of your AI visibility data.

Visibility Trend

A line chart that plots your brand visibility percentage over time. Each data point represents the percentage of snapshots that mentioned your brand on a given day.

Use this chart to:

  • Identify upward or downward trends in brand mention frequency.
  • Spot sudden drops that may indicate changes in AI model behavior.
  • Correlate visibility changes with your content publishing or SEO activity.
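The daily visibility metric behind this chart can be sketched as a simple group-and-divide over snapshots. The Snapshot shape below is illustrative, not the product's actual schema:

```typescript
// Illustrative snapshot shape -- field names are assumptions, not the real API.
interface Snapshot {
  date: string;           // ISO day, e.g. "2024-05-01"
  brandMentioned: boolean;
}

// Percentage of snapshots on each day that mentioned the brand.
function dailyVisibility(snapshots: Snapshot[]): Map<string, number> {
  const totals = new Map<string, { mentioned: number; total: number }>();
  for (const s of snapshots) {
    const t = totals.get(s.date) ?? { mentioned: 0, total: 0 };
    t.total += 1;
    if (s.brandMentioned) t.mentioned += 1;
    totals.set(s.date, t);
  }
  const result = new Map<string, number>();
  for (const [date, t] of totals) {
    result.set(date, (t.mentioned / t.total) * 100);
  }
  return result;
}
```

Each resulting value corresponds to one point on the line chart.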

LLM Comparison

A bar chart that compares your visibility across different AI models. Each bar represents a different model (such as ChatGPT, Claude, Gemini, or Perplexity), showing the percentage of snapshots from that model that mentioned your brand.

Use this chart to:

  • Identify which AI engines mention your brand most and least frequently.
  • Prioritize optimization efforts toward models where visibility is lowest.
  • Understand model-specific differences in how your brand is represented.

Sentiment Distribution

A pie chart showing the breakdown of sentiment across all analyzed snapshots — positive, neutral, and negative.

Use this chart to:

  • Get a quick read on overall AI perception of your brand.
  • Track whether the sentiment balance is shifting over time.
  • Identify when negative sentiment spikes and investigate the cause.
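The pie chart's slices are per-label shares of the analyzed snapshots. A minimal sketch, assuming each snapshot has already been assigned one of the three sentiment labels:

```typescript
type Sentiment = "positive" | "neutral" | "negative";

// Share of each sentiment label, as percentages of all analyzed snapshots.
// Input shape is illustrative; labeling itself happens upstream.
function sentimentDistribution(labels: Sentiment[]): Record<Sentiment, number> {
  const counts: Record<Sentiment, number> = { positive: 0, neutral: 0, negative: 0 };
  for (const l of labels) counts[l] += 1;
  const total = labels.length || 1; // avoid division by zero on empty input
  return {
    positive: (100 * counts.positive) / total,
    neutral: (100 * counts.neutral) / total,
    negative: (100 * counts.negative) / total,
  };
}
```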

Intelligence Scores

The Analytics page surfaces six proprietary Intelligence Scores — composite metrics that distill your snapshot data into actionable numbers. Each score is displayed as a KPI card with a 0-100 value and trend indicator.

  • BNCI (Brand Narrative Control Index): how consistently AI engines convey your intended brand narrative.
  • CMCS (Cross-Model Consistency Score): whether different AI models describe your brand the same way.
  • MEI (Model Equity Index): your brand's relative strength across the AI model ecosystem.
  • SDI (Sentiment Drift Index): how stable or volatile AI sentiment toward your brand is over time.
  • CIPS (Competitive Intelligence Positioning Score): your brand's positioning relative to competitors in AI responses.
  • ETAS (Entity Trust & Authority Score): how much trust and authority AI models attribute to your brand as a knowledge entity.

These scores update automatically as new snapshots are captured. Together, they provide a comprehensive view of your AI brand health that goes beyond simple visibility percentages.

Tip: Use CMCS to identify models that describe your brand inconsistently, then investigate the specific prompts where discrepancies appear. A low CMCS often indicates opportunities to improve your content for specific AI engines.

For a deep dive into each score's methodology and how to improve them, see the dedicated Intelligence Scores page.

Trends Page

The Trends page takes a higher-level view, designed for quick weekly reviews and executive-level reporting.

KPI Cards

Four key performance indicators are displayed at the top of the page:

  • Total Snapshots: the total number of snapshots captured within the selected date range.
  • Avg Visibility: the average brand visibility percentage across all snapshots in the range.
  • Avg Rank: the average rank position where your brand appeared in ordered lists.
  • Trend Direction: whether visibility is trending up, down, or stable compared to the prior period.
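These four KPIs can be sketched from raw snapshot data. The field names (visible, rank) and the sign-based trend comparison against a prior period are assumptions for illustration:

```typescript
// Illustrative snapshot shape -- not the product's actual schema.
interface Snapshot {
  visible: boolean;    // brand mentioned in this snapshot
  rank: number | null; // position in an ordered list, if the brand appeared in one
}

function kpis(current: Snapshot[], prior: Snapshot[]) {
  const avgVisibility = (xs: Snapshot[]) =>
    xs.length === 0 ? 0 : (100 * xs.filter((s) => s.visible).length) / xs.length;

  // Average rank over snapshots where the brand held a ranked position.
  const ranks = current.flatMap((s) => (s.rank === null ? [] : [s.rank]));
  const avgRank =
    ranks.length === 0 ? null : ranks.reduce((a, b) => a + b, 0) / ranks.length;

  // Sign of the change vs the prior period; a real implementation
  // may use a tolerance band before calling a trend "up" or "down".
  const delta = avgVisibility(current) - avgVisibility(prior);
  const trend = delta > 0 ? "up" : delta < 0 ? "down" : "stable";

  return {
    totalSnapshots: current.length,
    avgVisibility: avgVisibility(current),
    avgRank,
    trend,
  };
}
```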

Visibility Area Chart

A filled area chart showing brand visibility over the selected time range. The area fill makes it easy to see the overall trajectory at a glance, while the line edge shows daily fluctuations.

LLM Performance Bars

Similar to the Analytics page's LLM Comparison, this visualization shows per-model performance as horizontal bars. The Trends page version is optimized for quick scanning and comparison.

Top 5 and Bottom 5 Prompts

Two ranked lists that highlight your best-performing and worst-performing prompts within the selected date range:

  • Top 5 Prompts: The prompts with the highest brand visibility. These represent your strongest areas of AI presence.
  • Bottom 5 Prompts: The prompts with the lowest brand visibility. These represent your biggest opportunities for improvement.

Tip: Check the Bottom 5 Prompts list weekly and focus your content strategy on the topics where AI models are not yet mentioning your brand.
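Conceptually, these two lists are a sort-and-slice over per-prompt visibility. The PromptStat shape below is hypothetical:

```typescript
// Hypothetical per-prompt aggregate; "visibility" is the percent of
// snapshots for this prompt that mentioned the brand.
interface PromptStat {
  prompt: string;
  visibility: number; // 0-100
}

// Returns the n best and n worst prompts by visibility,
// each list ordered with the most extreme entry first.
function topAndBottom(stats: PromptStat[], n = 5) {
  const sorted = [...stats].sort((a, b) => b.visibility - a.visibility);
  return {
    top: sorted.slice(0, n),
    bottom: sorted.slice(-n).reverse(),
  };
}
```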

Segment Filter

Both the Analytics and Trends pages include a Segment Toggle that lets you filter all data by prompt type:

  • All: data from all prompts (the default view).
  • Branded: only prompts that mention your brand name.
  • Non-Branded: only organic discovery prompts (no brand or competitor mentions).
  • Competitor: only prompts that compare your brand to competitors.

The Non-Branded view is particularly valuable because it shows your true organic AI visibility — how often AI models recommend you when users aren't specifically asking about you. The Competitor view reveals how you perform in direct comparison queries.

The segment toggle is URL-synced (?segment=branded), so filtered views are bookmarkable and can be shared with team members.
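URL-synced state like this can be read and written with the standard URL API. Only the segment parameter name and the branded value come from the docs; the value spellings for the other segments are assumptions:

```typescript
type Segment = "all" | "branded" | "non-branded" | "competitor";

// Parse the segment from a shared URL, falling back to the default view.
function readSegment(url: string): Segment {
  const value = new URL(url).searchParams.get("segment") ?? "all";
  const valid: string[] = ["all", "branded", "non-branded", "competitor"];
  return valid.includes(value) ? (value as Segment) : "all";
}

// Produce a bookmarkable URL for a given segment.
function withSegment(url: string, segment: Segment): string {
  const u = new URL(url);
  if (segment === "all") u.searchParams.delete("segment"); // default needs no param
  else u.searchParams.set("segment", segment);
  return u.toString();
}
```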

Tip: Compare your Branded visibility (should be high) against your Non-Branded visibility (the real signal). A large gap means AI models know your brand but don't organically recommend it — this is your biggest opportunity for improvement.

Date Range Filters

Both the Analytics and Trends pages support date range filtering to focus your analysis on the time period that matters most.

  • 7 days: daily monitoring and spotting recent changes; ideal for checking the impact of content published this week.
  • 30 days: weekly and monthly reviews; provides enough data to see meaningful trends without noise.
  • 90 days: quarterly analysis and strategic planning; shows long-term trajectory and seasonal patterns.

Select a date range using the filter controls at the top of either page. All charts and KPIs update immediately to reflect the selected period.

How to Use Analytics Effectively

Weekly Review Workflow

  1. Open the Trends page with a 7-day filter.
  2. Check the Trend Direction KPI — is visibility moving up or down?
  3. Review the Bottom 5 Prompts — are there any new entries since last week?
  4. Switch to the Analytics page for deeper investigation if you see unexpected changes.

Monthly Review Workflow

  1. Open the Trends page with a 30-day filter.
  2. Compare Avg Visibility and Avg Rank to the prior month's values.
  3. Check the LLM Performance Bars — has any model's behavior changed significantly?
  4. Review the Sentiment Distribution on the Analytics page — is the positive/negative balance stable?

After a Content Campaign

  1. Set the date range to cover the period before and after your content was published.
  2. Watch the Visibility Trend line chart for inflection points.
  3. Check the LLM Comparison to see if the new content improved visibility across all models or just specific ones.

Analytics and Other Features

  • Dashboard: Dashboard metrics are a real-time summary; Analytics provides the historical detail.
  • Snapshots: every data point in Analytics comes from an individual snapshot.
  • Sentiment Analysis: the Sentiment Distribution chart connects to the deeper sentiment breakdowns.
  • Prompts: the Top 5 and Bottom 5 lists link directly to prompt performance.
  • Intelligence Scores: the six score cards appear on the Analytics page; the dedicated page provides methodology and improvement guidance.