# Core Concepts
Learn the key terminology and data model behind AEO Optima — from organizations and projects to snapshots, visibility metrics, and AEO scoring.
## Key Terminology
The table below defines the core terms used throughout AEO Optima.
| Term | Definition |
|---|---|
| Organization | A team or company workspace. All projects, team members, API keys, and billing are scoped to an organization. You can belong to multiple organizations. |
| Project | A brand, product, or business unit being tracked. Each project has its own set of prompts, LLM configurations, snapshots, and analytics. An organization can contain multiple projects. |
| Prompt | A question or query sent to AI models during a snapshot. Prompts should reflect the real questions your target audience asks AI chatbots (e.g., "What is the best CRM for startups?"). |
| Snapshot | A point-in-time capture of AI responses to your prompts. Each snapshot records the full text response from every configured AI model for every active prompt. Snapshots are the foundation of all analytics. |
| Mention Rate | The percentage of snapshot responses that mention your brand. A mention rate of 60% means your brand appeared in 6 out of 10 AI responses. Previously labeled "Brand Visibility" in the UI. |
| Visibility Score | A composite 0–100 score that weights mention rate, rank position, sentiment, and citation strength into a single quality measure. Use this to track overall AI presence; use Mention Rate to see raw frequency. |
| Rank Position | Where your brand appears in ordered or numbered lists within an AI response. Position 1 means your brand was mentioned first; higher numbers mean lower placement. Not all responses contain ranked lists. |
| Sentiment | The tone of the AI's mention of your brand — classified as positive, neutral, or negative. Sentiment analysis helps detect reputation risks and track perception changes over time. |
| BYOK | Bring Your Own Key — the ability to connect your own LLM provider API keys (via OpenRouter gateway or direct provider keys) to AEO Optima. This gives you full control over API usage and costs. Alternatively, you can use the platform's managed API access. |
| AEO Score | A score from 0 to 100 that evaluates how well a specific web page is optimized for AI citation. Higher scores indicate better structure, clarity, and authority signals that make the page more likely to be referenced by AI models. |
| LLM Configuration | The set of AI models enabled for a project. Models are loaded dynamically from the platform's model registry, which is synchronized daily with the latest releases from all providers. |
| Share of Voice | A competitive metric showing what percentage of AI mentions in your category go to your brand versus competitors. |
| Model Registry | A centralized, automatically-updated catalog of all available AI models across providers. The registry syncs daily so new models become available without manual configuration. |
| Usage Record | A record of API usage per snapshot, tracking prompt tokens, completion tokens, and calculated costs. Every capture generates usage records for cost transparency. |
| Prompt Segment | A classification for each prompt: Branded (mentions your brand), Non-Branded (organic discovery queries), or Competitor (brand vs competitor comparisons). Segments let you filter analytics to see true organic visibility separately from branded queries. |
## Data Hierarchy
AEO Optima organizes data in a clear hierarchy. Understanding this structure helps you navigate the platform effectively.
### How It Works
- **Organization** is your top-level workspace. It contains your team members, API keys, and one or more projects.
- **Project** represents a single brand or product you want to monitor. Each project is independent — it has its own prompts, model configuration, schedules, and snapshot history.
- **Prompts** are the questions your project sends to AI models. You define these based on what your target audience might ask an AI chatbot about your industry, category, or brand.
- **LLM Configuration** determines which AI models are queried. Models are selected from the platform's dynamic model registry. You can enable or disable models per project and optionally connect your own API keys.
- **Schedules** automate snapshot captures at regular intervals (hourly, daily, weekly, biweekly, or monthly). Each scheduled run produces a complete set of snapshots across all active prompts and models.
- **Snapshots** are generated when AEO Optima sends your prompts to the configured AI models and records their responses. Every snapshot produces visibility scores, rank positions, sentiment data, and usage records with token counts and costs.
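The hierarchy above can be sketched as nested data structures. The class and field names here are hypothetical, chosen only to mirror the concepts, not taken from AEO Optima's API:

```python
from dataclasses import dataclass, field

@dataclass
class Prompt:
    text: str
    segment: str            # "branded", "non-branded", or "competitor"
    active: bool = True

@dataclass
class Project:
    name: str
    prompts: list[Prompt] = field(default_factory=list)
    enabled_models: list[str] = field(default_factory=list)

@dataclass
class Organization:
    name: str
    projects: list[Project] = field(default_factory=list)

def responses_per_run(project: Project) -> int:
    """A scheduled run queries every enabled model with every active prompt."""
    active = sum(p.active for p in project.prompts)
    return active * len(project.enabled_models)
```

For example, a project with 4 active prompts and 3 enabled models produces 12 responses per scheduled run.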
## The Project Selector
The project selector in the sidebar is the primary navigation control. All data displayed in the dashboard, analytics, snapshots, and prompts views is filtered to the currently selected project.
When you switch projects, all metrics, charts, and tables update to reflect that project's data. If you manage multiple brands within one organization, use the project selector to move between them.
Tip: If your dashboard appears empty, check the project selector to make sure you have the correct project selected.
## Snapshot Lifecycle
Understanding the snapshot lifecycle helps you interpret your data:
- Trigger — A snapshot is initiated either manually (by clicking "New Snapshot") or automatically via a scheduled job.
- Key Resolution — AEO Optima resolves which API keys to use for each model via a 3-tier system: direct BYOK keys first, then gateway BYOK key, then platform-managed API.
- Querying — Each active prompt is sent to each enabled AI model. Responses are collected in parallel.
- Analysis — Each response is analyzed for brand mentions, rank position, sentiment, competitor mentions, citations, and improvement suggestions.
- Cost Tracking — Token usage (prompt + completion) is recorded and costs are calculated based on model-specific pricing from the model registry.
- Storage — The raw responses, computed metrics, and usage records are stored for historical comparison.
- Visualization — Results appear on the dashboard and in the analytics views, contributing to trend lines and aggregate metrics.
Snapshots are immutable once captured. They serve as a historical record of how AI engines responded at that specific moment in time.
## Metrics at a Glance
| Metric | What It Tells You | Range |
|---|---|---|
| Mention Rate | How often AI mentions your brand | 0% – 100% |
| Visibility Score | Composite quality score across mention rate, rank, sentiment, citations | 0 – 100 |
| Rank Position | Where you appear in AI recommendations | 1 (best) – 10+ |
| Sentiment | How AI perceives your brand | Positive / Neutral / Negative |
| AEO Score | How AI-ready a web page is | 0 – 100 |
| Share of Voice | Your brand's share of AI mentions vs. competitors | 0% – 100% |
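Share of Voice reduces to a simple proportion of category mentions. A minimal sketch with hypothetical brand names and counts:

```python
def share_of_voice(mention_counts: dict[str, int], brand: str) -> float:
    """Brand's percentage of all AI mentions in the category (0-100)."""
    total = sum(mention_counts.values())
    if total == 0:
        return 0.0
    return 100.0 * mention_counts.get(brand, 0) / total
```

If your brand received 30 of the 100 category mentions across a snapshot, its Share of Voice is 30%.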
## Next Steps
- Roles & Permissions — Understand team access levels
- Quick Start Guide — Set up your first project
- Dashboard — Explore your visibility metrics