APolyAEO

I Tracked a SaaS Company's Website for a Week. They're Already Doing AEO.


Last week, I was monitoring a mid-market B2B SaaS company — a cloud desktop provider serving small-to-medium businesses — when I noticed something unusual.

Between February 20 and February 27, they shipped 3 deployments, published 6 blog posts, completely restructured their pricing page, and quietly rolled out something that fewer than 1% of websites have done: they implemented llms.txt.

If you don't know what that is, you're already behind.

What Is llms.txt and Why Should You Care?

llms.txt is a proposed web standard — think of it as robots.txt for AI. Where robots.txt tells search engine crawlers what they may index, llms.txt tells large language models how to understand your site.

The format is simple: a structured markdown file at your domain root listing every important page with a one-line description. This company's file was 80KB, covering 330 pages across 9 sections — Pages, Solutions, Blog Posts, FAQs, Applications, Tutorials, Testimonials, Glossary, and Product Updates.

They also created:

- An llms-full.txt with expanded page content
- An llms-sitemap.xml pointing to both files

Why this matters: When ChatGPT, Claude, Gemini, or Perplexity answer a query about cloud desktops, they now have a structured map of this company's entire knowledge base. It's not a ranking signal yet — it's a discoverability signal. The AI equivalent of submitting your sitemap to Google in 2005.

The robots.txt Tells the Real Story

Their robots.txt got a complete overhaul. The old version was 7 lines — standard WordPress boilerplate. The new version is a strategic document:

User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no

User-agent: ClaudeBot
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

Read that carefully. They're blocking every major AI training bot (ClaudeBot, GPTBot, Google-Extended, Bytespider, CCBot, Amazonbot, meta-externalagent) — but they explicitly set ai-input=yes.

Translation: "You can use our content for RAG and real-time answers, but you cannot train your models on it."

This is the Content-Signal directive, a new Cloudflare-managed standard that includes legal language referencing EU Directive 2019/790 on copyright. It's the difference between "cite me in answers" and "absorb me into your weights."
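If you want to audit your own robots.txt for these directives, a rough sketch in Python follows. The `parse_content_signals` helper is my own name, not part of any library; the `search` / `ai-input` / `ai-train` keys follow the Cloudflare-managed syntax shown above.

```python
# Sketch: pull Content-Signal flags and fully blocked user-agents out of
# a robots.txt body. Helper name and return shape are illustrative.

def parse_content_signals(robots_txt: str):
    """Return (signals, fully_blocked_agents) parsed from a robots.txt body."""
    signals = {}
    blocked = set()
    group = []            # user-agents in the current record
    in_agent_run = False  # True while reading consecutive User-agent lines
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        if not line or ":" not in line:
            in_agent_run = False
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            # consecutive User-agent lines share one record
            group = group + [value] if in_agent_run else [value]
            in_agent_run = True
            continue
        in_agent_run = False
        if field == "content-signal":
            for pair in value.split(","):
                key, _, flag = pair.strip().partition("=")
                signals[key.strip()] = flag.strip().lower() == "yes"
        elif field == "disallow" and value == "/":
            blocked.update(group)
    return signals, blocked

SAMPLE = """User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no

User-agent: GPTBot
Disallow: /
"""
signals, blocked = parse_content_signals(SAMPLE)
print(signals, blocked)
```

Running it on the sample yields `{'search': True, 'ai-input': True, 'ai-train': False}` with `GPTBot` in the blocked set — the "cite me, don't train on me" posture in data form.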

This is the most sophisticated AI content strategy I've seen from a company this size.

The Content Blitz: 6 Posts in 5 Days

Between February 20 and 25, they published 6 blog posts. But the topic selection tells you everything:

| Date | Post | Strategy |
|------|------|----------|
| Feb 20 | IT Strategy for Business | Broad top-funnel |
| Feb 24 | Cybersecurity Posture | Trust/authority |
| Feb 25 | Mobile Device Management | Adjacent keywords |
| Feb 25 | Why VMs Are Ideal for AI Agents | Product-AI positioning |
| Feb 25 | Software Reseller Guide | Channel partner content |
| Feb 25 | End User Support | Operational keywords |

That AI agents post is the tell. At 3,700 words, it positions their core product (virtual machines) as the infrastructure layer for AI agent deployment, with headings that read like a product roadmap disguised as thought leadership.

They're not just doing SEO. They're positioning for the AI-adjacent search queries that will define their market in 12 months.

The Pricing Page Restructure

Before (February 5): Named business tiers — "The Founder", "The Startup", "The SMB", "The Firm", "The Enterprise". No visible per-user pricing. "Contact Us" for costs.

After (February 25+): Transparent per-user-type pricing — Basic ($35/mo), Essential ($40/mo), Mid Level ($60/mo), Heavy ($110/mo), Power ($240+/mo). Each tier maps to specific job roles and use cases.

The old pricing page? Still live at a separate URL — not even noindexed. They shipped fast.

Why this matters for AEO: When an AI answers "how much does cloud desktop cost for 10 developers?", it needs structured, extractable pricing data. Opaque "Contact Us" pricing is invisible to AI answers. Transparent per-user pricing with role descriptions is exactly what LLMs can parse and recommend.
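To make "extractable" concrete, here is that pricing as machine-readable records in a hypothetical Python sketch. The tier names and per-user prices come from the page above; the field names and the `monthly_cost` helper are mine.

```python
# Hypothetical sketch: the restructured pricing as structured records an
# LLM (or your own tooling) can extract and compute over. Tiers and
# prices match the page; field names are illustrative.

PRICING = [
    {"tier": "Basic",     "usd_per_user_month": 35},
    {"tier": "Essential", "usd_per_user_month": 40},
    {"tier": "Mid Level", "usd_per_user_month": 60},
    {"tier": "Heavy",     "usd_per_user_month": 110},
    {"tier": "Power",     "usd_per_user_month": 240},  # listed as $240+, so a floor
]

def monthly_cost(tier: str, seats: int) -> int:
    """Answer 'how much for N users on tier X?' the way an assistant would."""
    price = next(p["usd_per_user_month"] for p in PRICING if p["tier"] == tier)
    return price * seats

print(monthly_cost("Essential", 10))  # 400
```

Opaque "Contact Us" pricing gives an AI nothing to build this table from; the transparent version hands it over ready-made.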

They restructured their pricing to be AI-readable.

What They Got Right (And What They Missed)

What's working:

  1. llms.txt + llms-full.txt + LLM sitemap — First-mover advantage. When everyone does this in 2 years, the early adopters will already have established LLM mindshare.

  2. Content-Signal in robots.txt — Blocking training but enabling retrieval is the correct play. You want AI to cite you, not to learn from you and then forget the source.

  3. AI-adjacent content — Writing about AI agents running on VMs plants the seed for future AI-generated recommendations.

  4. Pricing transparency — Structured, role-based pricing that AI can parse and recommend.

  5. Publishing velocity — 24 posts in February. They're treating content as infrastructure.

What's missing:

The Playbook: What You Should Steal

If you're running a SaaS company in 2026, here's what to implement this week:

1. Create your llms.txt (30 minutes)

# Your Company Name

## Pages
- [Pricing](https://yoursite.com/pricing): Your pricing plans
- [Features](https://yoursite.com/features): Product features

## Blog Posts
- [Post Title](https://yoursite.com/blog/post): Summary

## FAQs
- [FAQ Topic](https://yoursite.com/faq): Summary

Drop it at your domain root. Create an llms-full.txt with expanded content. Add an llms-sitemap.xml pointing to both.
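If you'd rather script step 1 than hand-write the file, a minimal generator following the template above might look like this. The `PAGES` inventory, URLs, and site name are placeholders for your own content.

```python
# Minimal llms.txt generator matching the template above.
# PAGES and the site name are placeholders.

PAGES = {
    "Pages": [
        ("Pricing", "https://yoursite.com/pricing", "Your pricing plans"),
        ("Features", "https://yoursite.com/features", "Product features"),
    ],
    "Blog Posts": [
        ("Post Title", "https://yoursite.com/blog/post", "Summary"),
    ],
    "FAQs": [
        ("FAQ Topic", "https://yoursite.com/faq", "Summary"),
    ],
}

def render_llms_txt(site_name: str, sections: dict) -> str:
    lines = [f"# {site_name}", ""]
    for section, entries in sections.items():
        lines.append(f"## {section}")
        lines.extend(f"- [{title}]({url}): {summary}"
                     for title, url, summary in entries)
        lines.append("")  # blank line between sections
    return "\n".join(lines)

# Write the result to llms.txt at your domain root.
print(render_llms_txt("Your Company Name", PAGES))
```

Regenerating the file from your CMS inventory on each deploy keeps it from drifting out of date the way hand-maintained sitemaps do.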

2. Set up Content-Signal in robots.txt (5 minutes)

User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no

Let AI use your content for answers. Block it from training. This isn't just strategy — it's IP protection.

3. Make your pricing AI-readable (1 day)

If your pricing says "Contact Us" or hides behind a demo request, AI will never recommend you. Structure it with explicit per-user prices, named tiers, and the job roles or use cases each tier serves.

4. Add proper schema markup (2 hours)

At minimum: Product or SoftwareApplication on product pages, FAQPage on FAQ content, BreadcrumbList on all pages, correct author markup with real names.
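As a starting point for the SoftwareApplication type, here is a minimal JSON-LD block built with Python's stdlib `json` module; every value is a placeholder to swap for your own product details.

```python
import json

# Minimal SoftwareApplication JSON-LD (schema.org vocabulary).
# All values below are placeholders.
schema = {
    "@context": "https://schema.org",
    "@type": "SoftwareApplication",
    "name": "Your Product",
    "applicationCategory": "BusinessApplication",
    "operatingSystem": "Web",
    "offers": {
        "@type": "Offer",
        "price": "35.00",
        "priceCurrency": "USD",
    },
}

# Embed the output in your page inside:
# <script type="application/ld+json"> ... </script>
print(json.dumps(schema, indent=2))
```

Pairing the `offers` block with the transparent per-user pricing from step 3 gives both crawlers and LLMs the same structured numbers.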

5. Write one AI-positioning post (half day)

Whatever your product does, write about how it intersects with AI workflows. You're not writing for today's SEO rankings. You're writing for the RAG sources that power next year's AI answers.

The Bigger Picture

SEO is 25 years old. We optimized for PageRank, then Panda, then RankBrain, then BERT. Each shift rewarded the companies that moved first.

AEO (AI Engine Optimization) is in its "submit your sitemap to Google" era. The signals are still forming. The standards are still draft. Most companies haven't heard of llms.txt or Content-Signal.

The company I tracked this week isn't a tech giant. They're a mid-market SaaS with 270 blog posts and a WordPress site. But they've already implemented llms.txt, configured Content-Signal directives, restructured their pricing for AI readability, and started positioning their product for AI-adjacent queries.

They're not waiting for AEO to become standard. They're building the foundation while everyone else is still arguing about whether AI will replace Google.

By the time the rest of the market catches up, the early movers will already be the default answer.


Want to get ahead of the AEO curve? Start with llms.txt. It takes 30 minutes and puts you ahead of 99% of websites.