BotRender Best Practices
Optimize your BotRender integration for maximum SEO performance, faster indexing, and better search engine visibility.
Worker-First (Recommended)
Serve cached HTML to bots; on a cache miss, return a fast 404 and queue a render in the background. Optionally fall back to the SPA so the first crawl still gets content.
- Consistent sub-200ms responses to crawlers
- Avoids crawler timeouts and crawl-budget penalties
- Stable under concurrency spikes and third-party latency
- Use recache/pre-warm APIs for fresh content
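The worker-first flow can be sketched as a small request handler. This is a minimal sketch, not BotRender's actual API: `cache` and `renderQueue` are hypothetical stand-ins for your cache store and render queue.

```javascript
// Worker-first handler: serve cached HTML on a hit; on a miss,
// answer fast and queue a background render for the next crawl.
function handleBotRequest(cache, renderQueue, url) {
  const cached = cache.get(url);
  if (cached) {
    // Cache hit: the crawler gets prerendered HTML immediately
    return { status: 200, body: cached };
  }
  // Cache miss: respond quickly, render in the background
  renderQueue.push(url);
  return { status: 404, body: '' };
}
```

Because the handler never renders inline, response time stays flat no matter how slow the page is to render.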
Real-Time Rendering (Optional)
Synchronously render on bot requests. Higher latency and risk of timeouts; use with caution.
- Expect seconds of latency during cold starts/heavy pages
- Burst bot traffic can overload origin/middleware
- Restrict to a few critical routes and set strict timeouts
- Prefer pre-warming over synchronous rendering when possible
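One way to enforce a strict timeout around synchronous rendering is `Promise.race`. In this sketch, `renderPage` is a hypothetical render function; on timeout the crawler gets the raw SPA shell instead of waiting.

```javascript
// Race a render against a hard deadline.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((resolve) => {
    timer = setTimeout(() => resolve({ timedOut: true }), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Render synchronously, but never keep the crawler waiting past `ms`.
async function renderOrFallback(renderPage, url, ms) {
  const result = await withTimeout(renderPage(url), ms);
  if (result && result.timedOut) {
    // Deadline hit: serve the SPA shell instead
    return { status: 200, body: '<!-- SPA fallback -->', fallback: true };
  }
  return { status: 200, body: result, fallback: false };
}
```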
404 Not Found Pages
Return proper 404 status for missing pages
```html
<meta name="prerender-status-code" content="404">
```
301 Redirects
Handle permanent redirects correctly
```html
<meta name="prerender-status-code" content="301">
<meta name="prerender-header" content="Location: https://new-url.com">
```
Dynamic Status Codes
Set status codes programmatically based on content
```javascript
// JavaScript example
if (pageNotFound) {
  document.head.innerHTML += '<meta name="prerender-status-code" content="404">';
}
```
Prerender Ready Flag
Control when BotRender considers the page fully loaded
```html
<script>
  // Initially set to false
  window.prerenderReady = false;

  // Set to true when all content is loaded
  Promise.all([
    loadCriticalData(),
    loadAsyncComponents()
  ]).then(() => {
    window.prerenderReady = true;
  });
</script>
```
Wait for API Calls
Delay rendering until critical API requests complete
```javascript
async function waitForData() {
  const response = await fetch('/api/critical-data');
  const result = await response.json();

  // Render content with data
  renderComponent(result);

  // Signal ready for prerendering
  window.prerenderReady = true;
}
```
Strategic Cache Expiration
Set appropriate cache times based on content update frequency
- Static content: 24-48 hours
- Dynamic content: 1-6 hours
- Real-time content: 15-30 minutes
- Emergency updates: Use API recache
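The tiers above can be encoded as a small lookup so every route picks a consistent TTL. The tier names and values mirror the guidance above; adjust them to your own content mix.

```javascript
// Cache TTLs in seconds, one per content tier.
const CACHE_TTL_SECONDS = {
  static: 48 * 3600,   // static content: 24-48 hours (upper bound here)
  dynamic: 6 * 3600,   // dynamic content: 1-6 hours
  realtime: 30 * 60,   // real-time content: 15-30 minutes
};

function cacheTtlFor(contentType) {
  // Unknown types fall back to the dynamic tier
  return CACHE_TTL_SECONDS[contentType] ?? CACHE_TTL_SECONDS.dynamic;
}
```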
API-Based Recaching
Trigger recaching when content changes
```javascript
// Recache when content updates
const recacheUrl = async (url) => {
  await fetch('https://api.botrender.com/recache', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_TOKEN',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ url })
  });
};

// Trigger after content update
await updateArticle(articleId);
await recacheUrl(`/articles/${articleId}`);
```
Complete User Agent List
Include all major search engines, social platforms, and AI crawlers
- Search engines: Google, Bing, Yahoo, DuckDuckGo, Baidu, Yandex
- Social platforms: Facebook, Twitter, LinkedIn, Pinterest
- AI platforms: ChatGPT, Claude, Perplexity, Bard
- SEO tools: Ahrefs, SEMrush, Moz
Future-Proof Detection
Use patterns that catch new bot variants
```javascript
const isBot = (userAgent) => {
  const botPattern = /(googlebot|bingbot|slurp|duckduckbot|baiduspider|yandexbot|facebookexternalhit|twitterbot|linkedinbot|pinterestbot|chatgpt-user|gptbot|claude-web|perplexitybot|ahrefsbot|semrushbot)/i;
  return botPattern.test(userAgent);
};
```
Critical Resource Loading
Prioritize loading of essential content
- Load above-the-fold content first
- Defer non-critical JavaScript
- Optimize images with lazy loading
- Minimize render-blocking resources
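The list above translates into a few HTML attributes. A minimal sketch, with placeholder file names:

```html
<!-- Defer non-critical JavaScript so it doesn't block first paint -->
<script src="/js/analytics.js" defer></script>

<!-- Lazy-load below-the-fold images -->
<img src="/img/hero-detail.jpg" loading="lazy" alt="Product detail">

<!-- Avoid render-blocking stylesheets for non-critical CSS -->
<link rel="preload" href="/css/below-fold.css" as="style"
      onload="this.rel='stylesheet'">
```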
JavaScript Optimization
Reduce JavaScript execution time for faster rendering
```javascript
// Optimize heavy computations
const optimizeRendering = () => {
  // Use requestIdleCallback for non-critical work
  requestIdleCallback(() => {
    performHeavyCalculations();
  });

  // Critical rendering path
  renderEssentialContent();
  window.prerenderReady = true;
};
```
Disable for Crawlers
Hide cookie banners from search engine bots
```javascript
// Detect if the request is from a crawler
const isCrawler = () => {
  const userAgent = navigator.userAgent || '';
  return /bot|crawler|spider|crawling/i.test(userAgent);
};

// Only show the cookie banner to real users
if (!isCrawler()) {
  showCookieBanner();
}
```
Server-Side Detection
Handle cookie banners at the server level
```javascript
// Express.js example
app.use((req, res, next) => {
  const isBot = /bot|crawler|spider/i.test(req.get('User-Agent'));
  res.locals.showCookieBanner = !isBot;
  next();
});
```
Dynamic Schema Generation
Generate schema markup based on page content
```javascript
const generateSchema = (pageData) => {
  const schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": pageData.title,
    "description": pageData.description,
    "author": pageData.author,
    "datePublished": pageData.publishDate
  };

  const script = document.createElement('script');
  script.type = 'application/ld+json';
  script.textContent = JSON.stringify(schema);
  document.head.appendChild(script);
};
```
Schema Validation
Ensure schema markup is valid before rendering
- Test with Google's Rich Results Test
- Validate JSON-LD syntax
- Include all required properties
- Use specific schema types when available
✅ Essential Setup
- Configure proper status codes for 404s and redirects
- Implement window.prerenderReady for dynamic content
- Set up comprehensive bot detection patterns
- Optimize cache expiration times for your content
🚀 Advanced Optimization
- Hide cookie banners from search engine crawlers
- Implement API-based recaching for content updates
- Generate dynamic structured data/schema markup
- Monitor and optimize JavaScript execution time
Ready to Optimize Your Integration?
Start implementing these best practices today and see immediate improvements in your SEO performance and search engine visibility.