Why Your Site Isn't Getting Indexed
Overview
You have launched your site, waited a few days, and searched Google -- but your pages are nowhere to be found. This is frustrating, and it is one of the most common questions we hear. The good news: the cause is almost always fixable. This guide walks through the five most likely reasons your site is not getting indexed.
Quick Diagnosis
Start by checking whether Google can even reach your pages:
# Check if your sitemap is accessible
curl -s -o /dev/null -w "%{http_code}" https://yourdomain.com/sitemap.xml
# Check if robots.txt is blocking crawlers
curl -s https://yourdomain.com/robots.txt
# Check what bots actually see
curl -s -A "Googlebot/2.1" "https://yourdomain.com/" | head -30
Work through each result and compare it to the sections below.
Common Causes
1. Sitemap Missing or Returning 404
Symptom: Visiting https://yourdomain.com/sitemap.xml in your browser returns a 404 error, a blank page, or invalid XML.
Why it matters: A sitemap tells Google which pages exist and when they were last updated. Without one, Google relies entirely on following links to discover your pages, which is slower and less reliable.
Verification:
curl -s https://yourdomain.com/sitemap.xml | head -10
Expected: Valid XML starting with <?xml version="1.0" and containing <urlset> or <sitemapindex> tags.
Fix:
- If your site is on RndrKit, check the Sitemaps guide to enable automatic sitemap generation.
- If your framework generates a sitemap (Next.js, Gatsby, etc.), verify it is configured and deployed.
- If you need to create one manually, place a sitemap.xml file in your /public directory with URLs for every page.
- Submit your sitemap to Google Search Console at Sitemaps > Add a new sitemap.
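If you do write the file by hand, a minimal sitemap follows the standard sitemap protocol. The URLs below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/about</loc>
  </url>
</urlset>
```

Every <url> entry needs a <loc>; <lastmod> is optional but helps Google prioritize recrawls.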
2. robots.txt Blocking Crawlers
Symptom: Your robots.txt contains Disallow: / under User-agent: *, telling all search engines to stay away from your entire site.
Verification:
curl -s https://yourdomain.com/robots.txt
What to look for:
# Bad -- blocks everything
User-agent: *
Disallow: /
# Good -- allows everything
User-agent: *
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
Fix:
- Check the Robots.txt guide for how to configure robots.txt through the RndrKit dashboard.
- If your framework generates a robots.txt automatically, check its configuration. Some development modes and staging environments default to blocking all crawlers.
- Remove or modify the Disallow: / directive. You can still block specific paths (like /admin or /api) while allowing the rest of your site.
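For example, a robots.txt that keeps crawlers out of private paths while leaving the rest of the site open might look like this (the blocked paths are illustrative):

```text
# Allow all crawlers, but keep them out of admin and API routes
User-agent: *
Disallow: /admin
Disallow: /api
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```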
3. No Links Pointing to Your Pages
Symptom: Your sitemap exists and robots.txt is fine, but specific pages are not getting indexed. These pages are "orphaned" -- no other page links to them.
Why it matters: Google discovers pages by following links. If a page has no incoming links from your site's navigation, other pages, or external sites, Google may never find it -- even if it is in your sitemap.
Fix:
- Add navigation links. Every important page should be reachable from your main navigation or footer.
- Add internal links. Link to related pages from within your content. Blog posts should link to relevant product pages and vice versa.
- Include all pages in your sitemap. This acts as a backup discovery mechanism.
- Submit individual URLs in Google Search Console. Use URL Inspection > Request Indexing for pages you need indexed urgently.
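To spot orphaned pages without waiting on Google, you can compare your sitemap against the links your pages actually emit. The sketch below runs on local sample files so it works anywhere; in practice you would first fetch the real files with curl (file names, URLs, and page contents here are placeholders):

```shell
# Hypothetical sketch: list sitemap URLs that the homepage never links to.
# In practice, populate the files with:
#   curl -s https://yourdomain.com/sitemap.xml > sitemap.xml
#   curl -s https://yourdomain.com/ > homepage.html
cat > sitemap.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/about</loc></url>
  <url><loc>https://yourdomain.com/pricing</loc></url>
</urlset>
EOF
cat > homepage.html <<'EOF'
<nav><a href="/about">About</a></nav>
EOF

# Pull each <loc>, strip the domain, and check for a matching href.
orphans=$(grep -o '<loc>[^<]*</loc>' sitemap.xml \
  | sed -e 's|<loc>||' -e 's|</loc>||' -e 's|https://yourdomain.com||' \
  | while read -r path; do
      grep -q "href=\"$path\"" homepage.html || echo "possibly orphaned: $path"
    done)
echo "$orphans"
```

Checking only the homepage understates your internal linking, so treat the output as candidates to review, not a definitive list.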
4. Your Site Is Too New
Symptom: Everything is configured correctly -- sitemap works, robots.txt allows crawling, links exist -- but Google still has not indexed your pages.
Why it matters: Google does not index sites instantly. For new domains, it can take anywhere from a few days to several weeks for pages to appear in search results.
Fix:
- Submit your sitemap to Google Search Console. This is the single most effective action for new sites.
- Request indexing for your homepage. In Google Search Console, use URL Inspection, enter your homepage URL, and click Request Indexing.
- Share your site on social media. Links from Twitter, Facebook, LinkedIn, and other platforms help Google discover your site faster.
- Get listed in directories. Industry-specific directories, Google Business Profile (for local businesses), and other trusted sites provide valuable discovery signals.
- Be patient. Once Google has your sitemap, it will crawl on its own schedule. Repeatedly requesting indexing does not speed things up.
5. JavaScript Not Being Pre-rendered
Symptom: Your site is a single-page application (React, Vue, Angular, Lovable, Bolt, etc.) and Google sees an empty shell instead of your content.
This is the core problem RndrKit solves.
Verification:
# See what Googlebot sees
curl -s -A "Googlebot/2.1" "https://yourdomain.com/" | head -30
Signs of a problem:
<!-- This is what bots see without pre-rendering -->
<!DOCTYPE html>
<html>
<head><title>My App</title></head>
<body>
<div id="root"></div>
<script src="/assets/index.js"></script>
</body>
</html>
The <div id="root"></div> is an empty SPA shell. There is no content for Google to index -- no headings, no text, no links. Google's JavaScript rendering service can sometimes execute your JavaScript, but it is slow, unreliable, and deprioritized.
Signs it is working correctly (with RndrKit):
<!-- This is what bots see with pre-rendering -->
<!DOCTYPE html>
<html>
<head>
<title>My App - Homepage</title>
<meta name="description" content="Your actual page description" />
</head>
<body>
<div id="root">
<header>...</header>
<main>
<h1>Welcome to My App</h1>
<p>Your actual page content appears here...</p>
</main>
</div>
</body>
</html>
Fix:
- Set up RndrKit for your domain.
- Once configured, verify bot rendering is working:
# Should return fully rendered HTML with your page content
curl -s -A "Googlebot/2.1" "https://yourdomain.com/" | grep -c "<h1>"
- If the count is 0, check Rendering Failures for debugging steps.
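A cruder cross-check than grepping for a specific tag is to measure how much visible text the bot response contains at all. This sketch runs against a saved copy of the empty-shell HTML shown above; swap in a real curl capture of your own homepage. The 50-character threshold is an arbitrary illustration, not a RndrKit-defined cutoff:

```shell
# Hypothetical sketch: flag an empty SPA shell by counting visible text.
# In practice, capture the bot's view with:
#   curl -s -A "Googlebot/2.1" "https://yourdomain.com/" > shell.html
cat > shell.html <<'EOF'
<!DOCTYPE html>
<html>
<head><title>My App</title></head>
<body>
<div id="root"></div>
<script src="/assets/index.js"></script>
</body>
</html>
EOF

# Strip tags and whitespace; an empty shell leaves almost nothing behind.
text=$(sed -e 's/<[^>]*>//g' shell.html | tr -d '[:space:]')
chars=${#text}
if [ "$chars" -lt 50 ]; then
  verdict="empty shell: only $chars characters of visible text"
else
  verdict="looks rendered: $chars characters of visible text"
fi
echo "$verdict"
```

Here the only text left after stripping tags is the title "My App", so the check reports an empty shell; a pre-rendered page would carry its headings and paragraphs through the same filter.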
After Fixing
Once you have addressed the issue:
- Re-submit your sitemap in Google Search Console.
- Request indexing for your most important pages.
- Monitor the Coverage report in Google Search Console over the next 1-2 weeks to confirm pages are being picked up.
Next Steps
- Rendering Failures -- Debug pre-rendering issues
- Sitemaps -- Configure your sitemap
- Robots.txt -- Manage crawler access
- SEO Audit -- Run a full SEO check on your site