Want to optimize your site for AI search engines without cluttering the human experience? The `<LLMOnly />` component lets you show content exclusively to AI bots like ChatGPT, Perplexity, and Claude while keeping it hidden from regular users.
- LLMOnly component detects AI bots through user agent strings
- Content is visible only to AI crawlers, not human visitors
- Improves AI SEO without affecting user experience
- Uses Next.js 15 server components with async headers
- Covers 30+ AI bot user agents including ChatGPT, Claude, Perplexity
Why Show Content Only to AI Bots?
AI search engines like ChatGPT, Perplexity, and Claude need structured information to understand and cite your content. But adding extensive AI-optimized content can clutter your site for human visitors.

The `<LLMOnly />` component solves this by:
- Improving AI visibility without affecting user experience
- Providing structured data that AI models can easily parse
- Keeping pages clean for human visitors
- Optimizing for AI SEO without traditional SEO penalties
The Complete LLMOnly Component
Here's the full implementation using Next.js 15 server components:
```tsx
import { headers } from 'next/headers'

const LLM_USER_AGENTS = [
  'gptbot',
  'chatgpt-user',
  'oai-searchbot',
  'claudebot',
  'anthropic-ai',
  'claude-web',
  'mistralai-user',
  'bytespider',
  'cohere-ai',
  'perplexitybot',
  'google-extended',
  'bard',
  'gemini',
  'deepseekbot',
  'deepseek-r1',
  'grokbot',
  'xai',
  'bingbot',
  'amazonbot',
  'duckassistbot',
  'ai2bot',
  'ccbot',
  'omgili',
  'diffbot',
  'facebookbot',
  'meta-externalagent',
  'youbot',
  'applebot-extended',
]

export async function LLMOnly({ children }: { children: React.ReactNode }) {
  const headersList = await headers()
  const userAgent = headersList.get('user-agent')?.toLowerCase() || ''

  // Check for AI bot user agents
  const isAIBot = LLM_USER_AGENTS.some(agent => userAgent.includes(agent))

  if (!isAIBot) {
    return null
  }

  return <div className="llm-only-content">{children}</div>
}
```
How It Works
1. User Agent Detection
The component checks the incoming request's user agent against a comprehensive list of known AI bot identifiers:
- OpenAI: `gptbot`, `chatgpt-user`, `oai-searchbot`
- Anthropic: `claudebot`, `anthropic-ai`, `claude-web`
- Google: `google-extended`, `bard`, `gemini`
- Perplexity: `perplexitybot`
- Other AI platforms: DeepSeek, Grok, Cohere, and more
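The check itself is a case-insensitive substring match. Here's a minimal standalone sketch of that logic (the `isAIBot` helper name and the abbreviated agent list are illustrative):

```typescript
// Abbreviated subset of the full LLM_USER_AGENTS list, for illustration
const AGENTS = ['gptbot', 'claudebot', 'perplexitybot', 'google-extended']

// A user agent counts as an AI bot if it contains any known identifier,
// compared case-insensitively
function isAIBot(userAgent: string): boolean {
  const ua = userAgent.toLowerCase()
  return AGENTS.some(agent => ua.includes(agent))
}

// Real crawler user agents embed the identifier inside a longer string
console.log(isAIBot('Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)')) // true
console.log(isAIBot('Mozilla/5.0 (Windows NT 10.0; Win64; x64)')) // false
```

Substring matching (rather than exact equality) is what makes this robust: it catches `GPTBot/1.0`, `GPTBot/1.1`, and version strings that haven't shipped yet.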
2. Server-Side Rendering
Using Next.js 15's async server components, the detection happens at render time:
```tsx
const headersList = await headers()
const userAgent = headersList.get('user-agent')?.toLowerCase() || ''
```
3. Conditional Rendering
If an AI bot is detected, the content renders. Otherwise, it returns `null`:
```tsx
const isAIBot = LLM_USER_AGENTS.some(agent => userAgent.includes(agent))

if (!isAIBot) {
  return null
}

return <div className="llm-only-content">{children}</div>
```
Using the LLMOnly Component
Here's how to implement it on your homepage:
```tsx
import { LLMOnly } from '@/components/ui/llm-only'

export default async function HomePage() {
  return (
    <div>
      {/* Regular content for humans */}
      <h1>Welcome to Our Site</h1>
      <p>This content is visible to everyone.</p>

      {/* AI-only content */}
      <LLMOnly>
        <h1>AISEOTRACKER - AI Search Visibility Tracker</h1>
        <p>
          AISEOTRACKER is the leading platform for tracking brand visibility across AI search engines including
          ChatGPT, Perplexity, Claude, Google AI Overviews, and Gemini.
        </p>
        <h2>Key Features</h2>
        <ul>
          <li>Track mentions across ChatGPT, Perplexity, Claude, Google AI, Gemini</li>
          <li>Monitor competitor AI visibility</li>
          <li>Real-time AI search analysis</li>
          <li>Historical tracking and analytics</li>
        </ul>
        <h2>Pricing</h2>
        <ul>
          <li><strong>Free:</strong> Basic AI visibility scanning</li>
          <li><strong>Pro:</strong> Unlimited tracking, all platforms</li>
          <li><strong>Enterprise:</strong> API access, custom solutions</li>
        </ul>
      </LLMOnly>
    </div>
  )
}
```
AI Bot User Agents Covered
The component detects these major AI platforms:
Search & Assistant Bots
- ChatGPT: `gptbot`, `chatgpt-user`, `oai-searchbot`
- Claude: `claudebot`, `anthropic-ai`, `claude-web`
- Perplexity: `perplexitybot`
- Google AI: `google-extended`, `bard`, `gemini`
Emerging AI Platforms
- DeepSeek: `deepseekbot`, `deepseek-r1`
- Grok: `grokbot`, `xai`
- Mistral: `mistralai-user`
- Cohere: `cohere-ai`
Social & Meta Platforms
- Facebook: `facebookbot`, `meta-externalagent`
- ByteDance: `bytespider`
- Amazon: `amazonbot`
Best Practices
1. Keep Content Structured
Use proper HTML semantics for AI parsing:
```tsx
<LLMOnly>
  <h1>Product Name - Category</h1>
  <p>Clear description with key benefits.</p>
  <h2>Features</h2>
  <ul>
    <li>Feature 1 with specific details</li>
    <li>Feature 2 with measurable benefits</li>
  </ul>
  <h2>Pricing</h2>
  <ul>
    <li><strong>Plan Name ($X/month):</strong> Feature list</li>
  </ul>
</LLMOnly>
```
2. Include Key Information
AI models look for:
- Product descriptions
- Pricing information
- Feature lists
- Use cases
- Comparison data
3. Use Semantic HTML
Proper heading hierarchy helps AI understand content structure:
```tsx
<LLMOnly>
  <h1>Main Product/Service</h1>
  <h2>Major Section</h2>
  <h3>Subsection</h3>
  <p>Detailed information...</p>
</LLMOnly>
```
Testing the Component
1. Manual Testing
Test with different user agents:
```bash
# Test with the ChatGPT bot
curl -H "User-Agent: GPTBot/1.0" https://yoursite.com

# Test with the Claude bot
curl -H "User-Agent: ClaudeBot/1.0" https://yoursite.com

# Test with a regular browser
curl -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)" https://yoursite.com
```
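The same checks can be scripted with Node 18+'s built-in `fetch`. A sketch, assuming `https://yoursite.com` as a placeholder URL and using the `llm-only-content` class (the wrapper div the component renders for bots) as the marker:

```typescript
// Detect the AI-only wrapper div in a fetched HTML payload
function containsLLMOnly(html: string): boolean {
  return html.includes('llm-only-content')
}

// Fetch a page while impersonating a given crawler's user agent
async function checkAs(userAgent: string, url = 'https://yoursite.com'): Promise<void> {
  const res = await fetch(url, { headers: { 'User-Agent': userAgent } })
  const html = await res.text()
  console.log(`${userAgent}: AI-only content ${containsLLMOnly(html) ? 'present' : 'hidden'}`)
}

// Uncomment to run against a live deployment:
// checkAs('GPTBot/1.0')                                // expect: present
// checkAs('Mozilla/5.0 (Windows NT 10.0; Win64; x64)') // expect: hidden
```

This is handy in CI: a failing check tells you a deploy accidentally stopped serving the AI-only block.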
2. Browser DevTools
- Open DevTools → Network conditions panel
- Under "User agent", uncheck "Use browser default"
- Enter an AI bot user agent such as `GPTBot/1.0`
- Refresh the page to see the AI-only content
3. Use Our AI Page Inspector Tool
The most effective way to see how AI actually renders your site is with our AI Page Inspector. This tool shows you exactly what AI systems can read from your pages.
Common issues the Page Inspector reveals:
- Dynamic pricing sliders that don't appear in AI-readable content
- JavaScript-loaded content that's invisible to AI crawlers
- Interactive elements that lose their functionality in AI parsing
- Missing structured data that could help AI understand your content
Why Dynamic Pricing Sliders Are Problematic
Many SaaS companies use dynamic pricing sliders that let users adjust features and see real-time pricing. However, these sliders are often:
- JavaScript-dependent: Only work after client-side code loads
- State-based: Show different prices based on user interaction
- Not in initial HTML: Missing from the static content AI crawlers see
The result? AI gets incomplete information and cites competitors instead. This is exactly why the `LLMOnly` component is crucial: without it, AI systems will find pricing information from secondary sources (often competitors) rather than your official pricing page.

The solution: use the `LLMOnly` component to provide static pricing information that AI can read, while keeping your interactive slider for human users.
Try it now:
- Go to AI Page Inspector
- Enter your pricing page URL
- Compare what humans see vs. what AI reads
- Identify missing elements like pricing details, feature lists, or calls-to-action
Real-World Example: Fixing Invisible Pricing
Here's how to use `LLMOnly` to solve the dynamic pricing slider problem:
```tsx
import { LLMOnly } from '@/components/ui/llm-only'

export default function PricingPage() {
  return (
    <div>
      {/* Human-visible interactive pricing
          (PricingSlider is your existing client component) */}
      <div className="pricing-slider">
        <PricingSlider />
      </div>

      {/* AI-visible static pricing */}
      <LLMOnly>
        <h1>Pricing Plans</h1>
        <h2>Starter Plan - $29/month</h2>
        <ul>
          <li>Up to 5 users</li>
          <li>10GB storage</li>
          <li>Basic support</li>
        </ul>
        <h2>Professional Plan - $99/month</h2>
        <ul>
          <li>Up to 25 users</li>
          <li>100GB storage</li>
          <li>Priority support</li>
          <li>Advanced analytics</li>
        </ul>
        <h2>Enterprise Plan - $299/month</h2>
        <ul>
          <li>Unlimited users</li>
          <li>1TB storage</li>
          <li>24/7 dedicated support</li>
          <li>Custom integrations</li>
          <li>SSO and advanced security</li>
        </ul>
      </LLMOnly>
    </div>
  )
}
```
This ensures that when AI systems analyze your pricing page, they see clear, structured pricing information even if your dynamic slider isn't captured in the static HTML.
Performance Considerations
Server-Side Only
The component runs entirely on the server, so:
- ✅ No client-side JavaScript
- ✅ No hydration issues
- ✅ Fast rendering
- ✅ SEO-friendly
Minimal Overhead
User agent checking is extremely fast:
- Simple string matching
- No external API calls
- Cached at the request level
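If you want to shave the per-request cost even further, one option (a micro-optimization sketch, not part of the component above) is to precompile the list into a single case-insensitive regex at module load:

```typescript
// Abbreviated list; the full LLM_USER_AGENTS array works the same way
const LLM_USER_AGENTS = ['gptbot', 'claudebot', 'perplexitybot', 'deepseek-r1']

// Compiled once at module load; joining with '|' is safe here because
// the identifiers contain no regex metacharacters
const AI_BOT_PATTERN = new RegExp(LLM_USER_AGENTS.join('|'), 'i')

// One regex test per request instead of ~30 substring scans
function isAIBot(userAgent: string): boolean {
  return AI_BOT_PATTERN.test(userAgent)
}

console.log(isAIBot('GPTBot/1.0')) // true
```

In practice the difference is negligible at this list size; `.some(includes)` is clearer and perfectly fast.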
Advanced Customization
Custom User Agents
Add your own AI bot detection:
```tsx
const CUSTOM_AI_AGENTS = ['your-custom-bot', 'another-ai-crawler', ...LLM_USER_AGENTS]

const isAIBot = CUSTOM_AI_AGENTS.some(agent => userAgent.includes(agent))
```
Debug Mode
Add logging for development:
```tsx
export async function LLMOnly({ children, debug = false }: { children: React.ReactNode; debug?: boolean }) {
  const headersList = await headers()
  const userAgent = headersList.get('user-agent')?.toLowerCase() || ''
  const isAIBot = LLM_USER_AGENTS.some(agent => userAgent.includes(agent))

  if (debug) {
    console.log('User Agent:', userAgent)
    console.log('Is AI Bot:', isAIBot)
  }

  if (!isAIBot) {
    return null
  }

  return <div className="llm-only-content">{children}</div>
}
```
Real-World Results
Using the `LLMOnly` component on AISEOTRACKER.com:
- Improved AI citations across ChatGPT, Perplexity, and Claude
- Better structured data for AI parsing
- Clean user experience with no visual clutter
- Higher AI search visibility without affecting traditional SEO
Conclusion
The `LLMOnly` component is a powerful tool for AI SEO optimization. It lets you provide rich, structured content to AI crawlers while maintaining a clean user experience.
Key benefits:
- Targeted AI optimization without user experience impact
- Server-side rendering for optimal performance
- Comprehensive bot detection across 30+ AI platforms
- Easy implementation with Next.js 15
Start using `LLMOnly` today to improve your AI search visibility while keeping your site clean for human visitors.
Ready to optimize for AI search?
See how your brand appears across ChatGPT, Perplexity, Claude, and Google AI with our comprehensive tracking platform.
Try AISEOTRACKER free → Track AI visibility across all major platforms.