# Dynamic Sitemaps with Supabase

## Overview
If your site has content stored in a database — blog posts, product listings, directory entries — you need a dynamic sitemap that updates automatically as content is added or removed. A static sitemap file in `/public` will not cut it.
This guide walks you through building a dynamic sitemap with Supabase Edge Functions and serving it through RndrKit's redirect rules.
## When You Need a Dynamic Sitemap

A static sitemap is fine if:

- Your site has a fixed set of pages (homepage, about, contact, services)
- Pages rarely change
- You can manually update `sitemap.xml` when routes change

A dynamic sitemap is needed if:
- You have a blog with posts stored in Supabase
- You have a product catalog or directory
- Content is created by users or pulled from an API
- Pages are added or removed frequently
## Static Approach: Manual Sitemap

If you only have a handful of pages, a static `sitemap.xml` in your `/public` folder works fine:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2026-02-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2026-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://www.example.com/contact</loc>
    <lastmod>2026-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```
Save this as `/public/sitemap.xml` and you are done. Skip ahead to Testing and Submission to submit it to Google.
## Dynamic Approach: Supabase Edge Function

For database-driven content, create a Supabase Edge Function that queries your tables and generates the sitemap XML on the fly.

### Step 1: Create the Edge Function

In your Supabase project, create a new Edge Function:

```bash
supabase functions new sitemap
```
### Step 2: Write the Function

Replace the contents of `supabase/functions/sitemap/index.ts` with:
```typescript
import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

const supabaseUrl = Deno.env.get("SUPABASE_URL")!;
const supabaseKey = Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!;
const siteUrl = "https://www.example.com";

Deno.serve(async () => {
  const supabase = createClient(supabaseUrl, supabaseKey);

  // Fetch published blog posts
  const { data: posts } = await supabase
    .from("posts")
    .select("slug, updated_at")
    .eq("published", true)
    .order("updated_at", { ascending: false });

  // Fetch active products
  const { data: products } = await supabase
    .from("products")
    .select("slug, updated_at")
    .eq("active", true)
    .order("updated_at", { ascending: false });

  // Static pages
  const staticPages = [
    { loc: "/", priority: "1.0", changefreq: "weekly" },
    { loc: "/about", priority: "0.8", changefreq: "monthly" },
    { loc: "/contact", priority: "0.5", changefreq: "monthly" },
    { loc: "/blog", priority: "0.9", changefreq: "daily" },
    { loc: "/products", priority: "0.9", changefreq: "daily" },
  ];

  let xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">`;

  // Add static pages
  for (const page of staticPages) {
    xml += `
  <url>
    <loc>${siteUrl}${page.loc}</loc>
    <changefreq>${page.changefreq}</changefreq>
    <priority>${page.priority}</priority>
  </url>`;
  }

  // Add blog posts
  for (const post of posts ?? []) {
    const lastmod = post.updated_at?.split("T")[0] ?? "";
    xml += `
  <url>
    <loc>${siteUrl}/blog/${post.slug}</loc>
    <lastmod>${lastmod}</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>`;
  }

  // Add products
  for (const product of products ?? []) {
    const lastmod = product.updated_at?.split("T")[0] ?? "";
    xml += `
  <url>
    <loc>${siteUrl}/products/${product.slug}</loc>
    <lastmod>${lastmod}</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>`;
  }

  xml += `
</urlset>`;

  return new Response(xml, {
    headers: {
      "Content-Type": "application/xml",
      "Cache-Control": "public, max-age=3600",
    },
  });
});
```
Customize the table names (`posts`, `products`) and column names (`slug`, `updated_at`, `published`, `active`) to match your schema.
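One caveat: the function interpolates slugs straight into the XML template, so a slug containing a reserved character (`&`, `<`, `>`) would produce invalid XML. If your slugs can contain such characters, an escaping helper is worth adding. This is a sketch (`escapeXml` is not part of the function above); you would call it as `escapeXml(post.slug)` inside the template literals:

```typescript
// XML-escape a string before interpolating it into the sitemap template.
// Ampersands are replaced first so the entities introduced by the later
// replacements are not themselves double-escaped.
function escapeXml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

console.log(escapeXml("tips-&-tricks")); // tips-&amp;-tricks
```

Slugs generated by most CMSs are already URL-safe, so this mainly matters for user-generated or imported content.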
### Step 3: Deploy the Function

```bash
supabase functions deploy sitemap --no-verify-jwt
```

The `--no-verify-jwt` flag is important because search engines need to access this URL without authentication. Your function will be available at:

```
https://<project-ref>.supabase.co/functions/v1/sitemap
```
Test it by visiting that URL in your browser. You should see valid XML.
## Proxying Through RndrKit

Now you need `/sitemap.xml` on your domain to serve the Edge Function's response. RndrKit's redirect rules make this easy.
### Set Up a Proxy Rule

1. Go to your domain's detail page in the RndrKit dashboard.
2. Click the Redirects tab.
3. Add a new rule:

| Field | Value |
|---|---|
| Rule Type | Proxy |
| Source | `/sitemap.xml` |
| Target | `https://<project-ref>.supabase.co/functions/v1/sitemap` |
A Proxy rule (as opposed to a Redirect) serves the content from the target URL without changing the browser's URL. Search engines will see your sitemap at `https://www.example.com/sitemap.xml` as expected.
### Disable RndrKit's Built-in Sitemap

If you previously generated a sitemap through RndrKit's dashboard, you should disable it so there is no conflict. Go to the Sitemap tab on your domain's detail page and disable automatic sitemap generation. This ensures that `/sitemap.xml` requests are handled by your proxy rule instead.
## Testing and Submission

### Test Your Sitemap

Open your sitemap in a browser to verify it looks correct:

```
https://www.example.com/sitemap.xml
```

You can also validate it with a sitemap validator such as the XML-Sitemaps.com Validator.
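Beyond eyeballing the output, you can run a quick programmatic sanity check on the response body. This sketch assumes you already have the sitemap as a string (for example via `await fetch(...).then((r) => r.text())`); it checks the XML declaration, the sitemap namespace, `<url>` tag balance, and the sitemap protocol's 50,000-URL-per-file limit:

```typescript
// Quick sanity check for a sitemap string: returns a list of problems,
// or an empty array if the basic structural checks all pass.
function checkSitemap(xml: string): string[] {
  const problems: string[] = [];
  if (!xml.trimStart().startsWith("<?xml")) {
    problems.push("missing XML declaration");
  }
  if (!xml.includes('xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"')) {
    problems.push("missing sitemap namespace");
  }
  const opens = (xml.match(/<url>/g) ?? []).length;
  const closes = (xml.match(/<\/url>/g) ?? []).length;
  if (opens !== closes) {
    problems.push(`unbalanced <url> tags: ${opens} open, ${closes} closed`);
  }
  if (opens > 50000) {
    problems.push("more than 50,000 URLs; split into a sitemap index");
  }
  return problems;
}

const sample = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
</urlset>`;

console.log(checkSitemap(sample)); // []
```

This is a lightweight smoke test, not a full XML parse; a dedicated validator will catch problems this misses.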
### Submit to Google Search Console

1. Go to Google Search Console.
2. Select your property.
3. Navigate to Sitemaps in the sidebar.
4. Enter `https://www.example.com/sitemap.xml` and click Submit.

Google will periodically re-fetch your sitemap to discover new pages. Since your Edge Function generates it dynamically, new content will be included automatically.

For a complete Google Search Console walkthrough, see the Google Search Console Setup guide.
### Submit to Bing Webmaster Tools

1. Go to Bing Webmaster Tools.
2. Select your site.
3. Go to Sitemaps and click Submit Sitemap.
4. Enter `https://www.example.com/sitemap.xml`.
### Reference in Robots.txt

Add a sitemap reference to your `robots.txt` so crawlers find it automatically:

```
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```
## Next Steps
- Google Search Console Setup -- Verify your site and track indexing
- Sitemaps -- Manage sitemaps through the RndrKit dashboard
- URL Redirects -- Learn more about redirect and proxy rules
- SEO Optimization Guide -- Meta tags, structured data, and more