SEO · Next.js · Google Analytics · technical SEO · lessons learned

We Built 3 Websites. None of Them Ranked.

Joshua Gutierrez · 9 min read

We launched three products this year at Axion Deep Labs. Three websites. All built on Next.js, all statically exported, all "optimized for SEO." Except they weren't. Not even close. Here's every real mistake we made and how we found out the hard way.

1. Our Entire Website Was Invisible to Google

This one still hurts.

Next.js has a file called loading.tsx. Drop it in a route folder under app/ and Next shows that loading UI, usually a spinner, while the route's content loads. Great for UX. Terrible for SEO.

In a static HTML export, that file wraps ALL page content in a Suspense boundary, and the fallback is what gets written to the exported files. When Google's crawler hit our site, it didn't see our homepage, our service pages, or our blog posts. It saw a loading spinner. On every single page.

We spent days fixing heading hierarchy, anchor text, meta descriptions. None of it mattered. The crawler was looking at an empty page the whole time. The fix was deleting one file.

Lesson: Don't trust your source code for SEO. Run the build, open the actual HTML output, and look at what the crawler sees. We wrote a script that counts words and checks for spinners in our built files now.
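The idea is simple enough to sketch. This isn't our production script; the function name and the spinner heuristic are ours for illustration:

```typescript
// Given the contents of a built HTML file, count the visible words and
// flag markup that looks like a loading/Suspense fallback.
function auditBuiltHtml(html: string): { words: number; hasSpinner: boolean } {
  const visible = html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // script bodies aren't content
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " "); // strip remaining tags
  const words = visible.split(/\s+/).filter(Boolean).length;
  // Heuristic: spinner class names or aria-busy in the static HTML usually
  // mean the fallback shipped instead of the page.
  const hasSpinner = /class="[^"]*spinner|aria-busy="true"/i.test(html);
  return { words, hasSpinner };
}
```

Run something like this over every file in out/ after next build and fail CI whenever a page comes back with a handful of words and a spinner.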

2. Google Analytics Said Our Tag Was Working. It Wasn't.

Google has a tag verification tool. You click "Test my website," get a green checkmark, and move on. We got the green checkmark. Two weeks later: zero data. Not low traffic. Zero.

The problem was Google Consent Mode v2. We'd implemented a cookie consent banner that defaulted analytics_storage to "denied" and only switched to "granted" when someone clicked "Accept." Most people don't click cookie banners. They ignore them.

Google's tag test checks if the script is present in the HTML. It doesn't check if consent mode is blocking data collection. Green checkmark. Zero data. Weeks of analytics, gone.

Lesson: If you're a US company, you probably don't need opt-in consent for analytics. Default to granted, make the banner an opt-out. And check your Realtime report after deploying, don't just trust the tag test.
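The ordering matters as much as the default: gtag() just queues its arguments onto the dataLayer array, so the consent default has to be queued before the config call or early pageviews are dropped. A sketch of that ordering, with the queue stubbed out locally so it stands alone (G-XXXXXXX is a placeholder ID; in a real page this lives in the gtag script snippet):

```typescript
// Stand-in for the real window.dataLayer queue, to show the ordering.
const dataLayer: unknown[][] = [];
function gtag(...args: unknown[]): void {
  dataLayer.push(args);
}

// 1. Consent default FIRST — opt-out model: collect until declined.
gtag("consent", "default", {
  analytics_storage: "granted",
  ad_storage: "denied",
});

// 2. Then the usual config calls.
gtag("js", new Date());
gtag("config", "G-XXXXXXX");

// The banner's decline handler flips analytics off.
function onDecline(): void {
  gtag("consent", "update", { analytics_storage: "denied" });
}
```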

3. We Had Five H1 Tags on One Page

Our homepage had a server-rendered article with an H1. A client component with another H1. Service cards with H1s inside them. Google doesn't know which one matters when there are five, so it kind of shrugs.

We also had H4 tags in the footer that appeared in the DOM before the H1. Heading hierarchy was completely broken. We didn't notice because visually it looked fine.

Lesson: One H1 per page. Everything else is H2 or lower. The H1 has to be the first heading in the DOM, not just the first one visible on screen.
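Once you extract the headings in DOM order, the check is mechanical, so it belongs in the same build-time audit. A sketch (names are illustrative, not a real library):

```typescript
// Given the page's headings in DOM order, enforce
// "exactly one H1, and it comes first".
interface Heading {
  level: number;
  text: string;
}

function checkHeadings(headings: Heading[]): string[] {
  const problems: string[] = [];
  const h1s = headings.filter((h) => h.level === 1);
  if (h1s.length !== 1) {
    problems.push(`expected exactly 1 H1, found ${h1s.length}`);
  }
  if (headings.length > 0 && headings[0].level !== 1) {
    problems.push(`first heading in the DOM is an H${headings[0].level}, not the H1`);
  }
  return problems;
}
```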

4. Every Link on Our Site Said "Learn More"

Twelve service cards. Every single one had a "Learn more" link. Identical anchor text, twelve times, pointing to twelve different pages.

Google uses anchor text to understand what the linked page is about. When every anchor says "Learn more," you're telling Google twelve different pages are about... nothing specific.

We replaced them with descriptive anchors. "Explore our SEO services." "See our web development process." "Learn how lead capture works." Took 20 minutes.

Lesson: Every link on your page should have unique, descriptive anchor text. If you can't tell what the target page is about from the anchor text alone, rewrite it.
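This one is easy to lint for, too. A sketch that flags generic or reused anchor text (the generic list is just a starting point):

```typescript
// Flag anchor text that's generic, or reused across different links.
const GENERIC = new Set(["learn more", "read more", "click here", "here", "more"]);

function flagAnchors(links: { text: string; href: string }[]): string[] {
  const counts = new Map<string, number>();
  for (const { text } of links) {
    const t = text.trim().toLowerCase();
    counts.set(t, (counts.get(t) ?? 0) + 1);
  }
  const flagged: string[] = [];
  for (const [text, count] of counts) {
    if (GENERIC.has(text)) flagged.push(`generic anchor text: "${text}" (x${count})`);
    else if (count > 1) flagged.push(`"${text}" reused ${count} times`);
  }
  return flagged;
}
```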

5. Client Components Killed Our Server-Rendered Content

Next.js server components render to HTML on the server. Client components need JavaScript to render, which means crawlers might not see them.

We had components using usePathname() from Next.js. That single hook forces the entire component to become a client component. Our content shell and footer were client components for no good reason. Just to highlight the current nav item.

We refactored them to server components, moved the pathname logic to a tiny client child, and suddenly our static HTML had actual content instead of empty divs waiting for hydration.

Lesson: If you're using Next.js, grep your components for "use client" and ask if each one truly needs to be client-side. One misplaced hook can make an entire page invisible to crawlers.
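The pattern we landed on looks roughly like this (file and component names are illustrative, assuming the App Router): the shell stays a server component, and only a leaf link component opts into the client.

```tsx
// nav-link.tsx — the only client piece
"use client";
import Link from "next/link";
import { usePathname } from "next/navigation";
import type { ReactNode } from "react";

export function NavLink({ href, children }: { href: string; children: ReactNode }) {
  const active = usePathname() === href; // URL state stays down here
  return (
    <Link href={href} aria-current={active ? "page" : undefined}>
      {children}
    </Link>
  );
}

// nav.tsx — no "use client": renders to real HTML in the static export
import { NavLink } from "./nav-link";

export function Nav() {
  return (
    <nav>
      <NavLink href="/">Home</NavLink>
      <NavLink href="/services">Services</NavLink>
    </nav>
  );
}
```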

6. Our Privacy Page Had the Wrong Analytics ID

We switched Google Analytics properties and updated the tag in our layout. But our privacy policy still had the old measurement ID hardcoded in the text. Anyone trying to verify what we track would have seen a different ID than what was actually running.

Not an SEO issue technically. But it's a trust issue. And trust issues become bounce rate issues.

7. Our Sitemap Had Wrong Dates

Our sitemap.xml was generated at build time using new Date(). The build ran on a CI server with a slightly off clock, and every rebuild restamped every page anyway, so our sitemap was telling Google that pages were last modified at times that didn't match reality. We hardcoded the real dates.
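What the fix looks like, roughly, in a Next.js app/sitemap.ts (URLs, slugs, and dates here are placeholders):

```typescript
// Hypothetical app/sitemap.ts. In the real file this function would be
// the default export, typed as MetadataRoute.Sitemap.
const LAST_MODIFIED: Record<string, string> = {
  "/": "2025-03-10",
  "/blog/example-post": "2025-06-01",
};

function sitemap() {
  return Object.entries(LAST_MODIFIED).map(([path, date]) => ({
    url: `https://example.com${path}`,
    lastModified: new Date(date), // fixed per page — identical on every build
  }));
}
```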

8. 78 Images With Zero Metadata

Every image was optimized for file size and format. WebP, compressed, proper dimensions. But none had IPTC or XMP metadata. No titles, no descriptions, no keywords, no copyright.

Google Image Search can read this metadata and may use it when indexing and crediting images. Social platforms use it as fallback when sharing. It's also how you prove ownership if someone takes your images.

We wrote exiftool scripts to tag every image with contextual titles, descriptions, keywords, author, copyright, and source URLs. 78 images. Took an hour.
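A sketch of that kind of script (tag values are placeholders, and it assumes exiftool is installed and on PATH). Building the argument list as its own function keeps the flags inspectable:

```typescript
import { execFileSync } from "node:child_process";

interface ImageMeta {
  title: string;
  description: string;
  keywords: string[];
  copyright: string;
}

// Assemble the exiftool arguments; XMP Dublin Core tags work for WebP too.
function exiftoolArgs(file: string, meta: ImageMeta): string[] {
  return [
    "-overwrite_original",
    `-XMP-dc:Title=${meta.title}`,
    `-XMP-dc:Description=${meta.description}`,
    ...meta.keywords.map((k) => `-XMP-dc:Subject=${k}`),
    `-XMP-dc:Rights=${meta.copyright}`,
    file,
  ];
}

function tagImage(file: string, meta: ImageMeta): void {
  execFileSync("exiftool", exiftoolArgs(file, meta)); // requires exiftool
}
```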

9. Our Old Brand Name Was Still in Five Places

We built from a template and rebranded everything. Or so we thought. Five places still had the old brand name. Two had the old email. The hero section still had template copy. All buried in secondary pages, JSON-LD schema, and error states that we never checked.

Lesson: Find and replace across your entire codebase after a rebrand. Then do it again.

The 4 Things That Actually Matter

After months of fixing SEO across three sites, it comes down to this:

1. Crawlability. Can Google actually see your content? Suspense boundaries, client rendering, and JavaScript dependencies can make your pages invisible.

2. Rendered Content. What does the built HTML look like? Not your source code. The actual output. Check it.

3. Data Accuracy. Is your analytics collecting? Are your sitemaps correct? Are your meta tags matching reality? If your data is wrong, every decision you make from it is wrong.

4. Structure. One H1. Descriptive anchors. Proper heading hierarchy. Image metadata. These are boring. They also compound over every page on your site.

Get these right first. Then worry about keyword strategy and backlinks. We got a 98 Lighthouse score on one of our sites and it generated zero leads. Performance is necessary but not sufficient. The basics have to work.

Most of these issues are invisible unless you know where to look. We built a free tool that checks 60+ of them in seconds. No login required. Try DeepAudit AI free.

Ready to build a website that performs?

Let us audit your current site, identify the biggest opportunities, and build a plan to grow your traffic and leads.

Get in Touch