Building a 100K+ page SEO machine with Next.js and Supabase

Programmatic SEO gets very interesting when you combine database-driven templates with good caching.

I've been experimenting with programmatic SEO lately. Instead of writing hundreds of pages manually, you let the database generate them.

The opportunity

Pitchkit has an investor database for European startups. We have ~950 investors with rich data: sectors they invest in, stages they focus on, countries they're based in. The obvious pages are individual investor profiles. But what about "fintech investors in Germany"? Or "pre-seed VCs in Finland"?

Manually creating these pages would be insane. There are 20+ sectors, 12 countries, 6 investment stages, and 7 investor types. The combinations explode quickly: sectors × countries alone is 240 pages, before you even touch stages or types.

The experiment

What if the database just generates all of them?

I started with a simple Next.js page:

/investors/sector/[sector]/page.tsx

The [sector] is a dynamic segment. Next.js asks "what values can sector have?" and I tell it:

export async function generateStaticParams() {
  const sectors = await loadUniqueSectors(); // from Supabase
  return sectors.map(sector => ({ sector }));
}

At build time, Next.js queries the database, gets all unique sectors, and generates a static HTML page for each one. Fintech page. SaaS page. Healthtech page. All pre-rendered.
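
The loader itself is nothing fancy: a Supabase query plus a dedupe. A minimal sketch, assuming an investors table with a text[] sectors column (table and column names are assumptions):

// lib/investors.ts (sketch; table and column names are assumptions)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
);

export async function loadUniqueSectors(): Promise<string[]> {
  const { data, error } = await supabase.from('investors').select('sectors');
  if (error) throw error;
  // Each row holds that investor's sector array; flatten and dedupe
  return [...new Set(data.flatMap((row) => (row.sectors ?? []) as string[]))];
}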

Going deeper: combinations

Single dimensions are easy. But the real SEO value is in combinations. "Fintech seed investors" is more specific than just "fintech investors". More specific = less competition = easier to rank.

So I added two-dimensional pages:

  • /investors/sector/[sector]/[stage]/page.tsx
  • /investors/stage/[stage]/[country]/page.tsx
  • /investors/type/[category]/[country]/page.tsx

The static params multiply:

export async function generateStaticParams() {
  const [sectors, stages] = await Promise.all([
    loadUniqueSectors(),
    loadUniqueStages(),
  ]);
  const params: { sector: string; stage: string }[] = [];
  sectors.slice(0, 20).forEach(sector => {
    stages.forEach(stage => {
      params.push({ sector, stage });
    });
  });
  return params; // 20 × 6 = 120 pages
}

20 sectors × 6 stages = 120 pages. Add country combinations and you're at 300+. Add investor types and you're pushing 500+.

All from one page template and a database query.
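
That one template is just an async server component that runs the filtered query. A minimal sketch, assuming text[] sectors and stages columns on an investors table (the filtering here is my guess at the shape, not Pitchkit's actual code):

// /investors/sector/[sector]/[stage]/page.tsx (hypothetical sketch)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
);

export default async function Page({
  params,
}: {
  params: Promise<{ sector: string; stage: string }>;
}) {
  const { sector, stage } = await params;
  // contains() matches rows whose array column includes the given value
  const { data: investors } = await supabase
    .from('investors')
    .select('id, name')
    .contains('sectors', [sector])
    .contains('stages', [stage]);

  return (
    <main>
      <h1>{sector} {stage} investors in Europe</h1>
      <ul>
        {(investors ?? []).map((inv) => (
          <li key={inv.id}>{inv.name}</li>
        ))}
      </ul>
    </main>
  );
}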

The caching trick

Here's what makes this scalable. These pages don't hit the database on every request. They're static HTML served from the edge.

But data changes. New investors get added. So I use ISR (Incremental Static Regeneration):

export const revalidate = 86400; // 24 hours

The page stays cached for 24 hours. After that, the next visitor triggers a background regeneration. They still get the cached version instantly, but Next.js quietly rebuilds the page with fresh data.

One million visitors? Roughly the same database load as one visitor. The DB only gets queried when a page is first built or revalidated, not on every request.
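
One more detail worth showing: with Next.js's default dynamicParams = true, a value that appears after the build (a brand-new sector, say) still gets a page. It's rendered on its first request and then cached like the pre-built ones:

// /investors/sector/[sector]/page.tsx (caching exports, sketch)
export const revalidate = 86400; // serve cached HTML; rebuild in the background after 24h

// Next.js default, shown explicitly: params not returned by
// generateStaticParams are rendered on first request, then cached
export const dynamicParams = true;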

Making Google happy

Static HTML is fast. Fast pages rank better. But there's more to SEO than speed.

Each page gets unique metadata:

export async function generateMetadata({ params }) {
  const { sector, stage } = await params;
  return {
    title: `${sector} ${stage} Investors in Europe | Pitchkit`,
    description: `Find ${sector} investors focusing on ${stage} stage startups...`,
  };
}

Plus structured data for rich snippets, breadcrumbs for navigation, internal links between related pages. The sector page links to all its stage sub-pages. The stage pages link back. Google loves internal linking.
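
The structured data can be emitted straight from the template. A sketch of the breadcrumb piece using schema.org's BreadcrumbList (the domain is a placeholder, and this is illustrative rather than Pitchkit's exact markup):

// components/BreadcrumbJsonLd.tsx (hypothetical)
const BASE = 'https://pitchkit.example'; // placeholder domain

export function BreadcrumbJsonLd({ sector, stage }: { sector: string; stage: string }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: [
      { '@type': 'ListItem', position: 1, name: 'Investors', item: `${BASE}/investors` },
      { '@type': 'ListItem', position: 2, name: sector, item: `${BASE}/investors/sector/${sector}` },
      { '@type': 'ListItem', position: 3, name: `${sector} ${stage} investors` },
    ],
  };
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
    />
  );
}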

The sitemap problem

500+ pages need a sitemap. But Next.js sitemaps are usually static. I needed dynamic sitemaps that query the database:

// /server-sitemap/investors-by-sector.xml/route.ts
export async function GET() {
  const sectors = await loadUniqueSectors();
  const stages = ['pre-seed', 'seed', 'series-a' /* ... */];
  const urls: { loc: string }[] = [];
  sectors.forEach(sector => {
    urls.push({ loc: `/investors/sector/${sector}` });
    stages.forEach(stage => {
      urls.push({ loc: `/investors/sector/${sector}/${stage}` });
    });
  });
  return new Response(generateSitemap(urls), {
    headers: { 'Content-Type': 'application/xml' },
  });
}
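
generateSitemap is a small helper the route relies on; a minimal version that builds the XML by hand (the base URL is a placeholder):

const BASE = 'https://pitchkit.example'; // placeholder domain

function generateSitemap(urls: { loc: string }[]): string {
  const entries = urls
    .map((u) => `  <url><loc>${BASE}${u.loc}</loc></url>`)
    .join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;
}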

Google gets a fresh sitemap every time it crawls. New investors, new pages, automatically included.

Learnings

Programmatic SEO gets very interesting when you combine database-driven templates with good caching. The hard parts are:

1. Data quality - Garbage in, garbage pages out. Clean, structured data is everything.

2. Unique value - Each page needs to be useful, not just exist. I added FAQs, related links, pitch guides.

3. Internal linking - Pages need to connect. Otherwise they're orphans Google won't find (see the sketch below).
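
For the linking piece, here's the shape of a related-links block a sector page could render, pointing at its stage sub-pages (component name and stage list are illustrative):

// components/RelatedStageLinks.tsx (hypothetical)
import Link from 'next/link';

const STAGES = ['pre-seed', 'seed', 'series-a']; // truncated, as in the sitemap route

export function RelatedStageLinks({ sector }: { sector: string }) {
  return (
    <nav aria-label="Related pages">
      <ul>
        {STAGES.map((stage) => (
          <li key={stage}>
            <Link href={`/investors/sector/${sector}/${stage}`}>
              {sector} {stage} investors
            </Link>
          </li>
        ))}
      </ul>
    </nav>
  );
}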

The experiment is working. Pages are getting indexed. Traffic is starting to trickle in for long-tail queries. Will update when I have more data.
