Your Google Lighthouse Score is a Lie: 5 Critical Errors It Missed

Reading time: 10 minutes

TLDR

We had perfect 100 Lighthouse scores, but our click-through rate was below industry average. A Scan audit found five critical issues Lighthouse missed: missing LocalBusiness schema, missing breadcrumb schema, accessibility violations beyond Lighthouse's automated checks, duplicate and missing meta tags, and broken internal links, redirect chains, and orphaned pages. Fixing these increased CTR from 2.1% to 3.2%, a 52% relative improvement that delivered 30% more clicks. Lighthouse tests technical performance. Scan tests SEO effectiveness and conversion optimization. Use both for complete coverage.


We had a perfect 100 Lighthouse score on our marketing site. Performance? Perfect. SEO? Perfect. Accessibility? Perfect. Best practices? Perfect.

Then we ran our own tool, Scan.

Here are the 5 critical issues Lighthouse missed that were costing us 30% of our organic click-through rate. And how fixing them transformed our visibility.

The Perfect Score Illusion

Lighthouse is a phenomenal tool. Google built it, it’s free, and it’s embedded in Chrome DevTools. For performance optimization, it’s unmatched.

But here’s what Google doesn’t tell you: Lighthouse is designed to catch technical performance problems, not comprehensive SEO or accessibility issues.

Our Lighthouse audit:

  • Performance: 100/100
  • Accessibility: 100/100
  • Best Practices: 100/100
  • SEO: 100/100

We shipped it, confident we’d nailed technical SEO.

Our Google Search Console told a different story:

  • Impressions: High (ranking well)
  • Clicks: Low (people saw us, didn’t click)
  • CTR: 2.1% (industry average: 3-5%)

Something was wrong. But Lighthouse said we were perfect.

What We Did Next

We ran our marketing site through Scan (our own SEO audit tool, built specifically to catch what Lighthouse misses).

The results: 5 critical errors that Lighthouse never flagged.

Fixing these issues increased our CTR from 2.1% to 3.2% (a 52% relative increase, or 30%+ more clicks for the same impressions).

Here’s what Lighthouse missed.


Error #1: Missing LocalBusiness Schema Markup

What Lighthouse checks: Whether structured data exists (yes/no)

What Lighthouse missed: Whether structured data is correct, complete, and optimized for rich results

Our Problem

We had basic Organization schema:

{
  "@type": "Organization",
  "@context": "https://schema.org",
  "name": "Surmado",
  "url": "https://www.surmado.com",
  "logo": "https://www.surmado.com/logo.png"
}

Lighthouse said: “Structured data is valid”

What Lighthouse didn’t tell us: We were missing LocalBusiness schema, which enables:

  • Map pack eligibility (critical for “near me” queries)
  • Business hours in SERPs
  • Review stars in search results
  • Address and phone number rich snippets

What We Fixed

We added complete LocalBusiness schema:

{
  "@type": "LocalBusiness",
  "@context": "https://schema.org",
  "name": "Surmado",
  "url": "https://www.surmado.com",
  "telephone": "+1-XXX-XXX-XXXX",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "San Francisco",
    "addressRegion": "CA",
    "postalCode": "94103",
    "addressCountry": "US"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": "37.7749",
    "longitude": "-122.4194"
  },
  "openingHoursSpecification": {
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
    "opens": "09:00",
    "closes": "17:00"
  },
  "priceRange": "$$",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "47"
  }
}
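
One placement note: this markup goes on the page as JSON-LD inside a script tag, usually in the head. A minimal sketch with the schema body abbreviated:

<head>
  <script type="application/ld+json">
  {
    "@type": "LocalBusiness",
    "@context": "https://schema.org",
    "name": "Surmado",
    "url": "https://www.surmado.com"
  }
  </script>
</head>

Google's Rich Results Test is a quick way to confirm the page is eligible for the rich result types the schema targets.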

The Impact

Before: Generic listing in SERPs (just title + meta description)

After: Rich snippet with:

  • Star rating (4.8/5 from 47 reviews)
  • Address visible in SERP
  • Business hours displayed
  • Price range indicator ($$)

CTR increased from 1.8% → 2.6% on queries with rich snippets (44% relative lift).

Why Lighthouse missed it: Lighthouse validates schema syntax, not schema strategy. It doesn’t know you’re missing LocalBusiness unless you explicitly test for it.


Error #2: Missing Breadcrumb Schema

What Lighthouse checks: Page structure and navigation links

What Lighthouse missed: Breadcrumb structured data that enables breadcrumb trails in search results

Our Problem

We had visual breadcrumbs on the page:

<nav>
  <a href="/">Home</a> > <a href="/products">Products</a> > Signal
</nav>

Lighthouse said: “Page has logical structure”

What Lighthouse didn’t tell us: Without BreadcrumbList schema, Google won’t show breadcrumb trails in SERPs.

What We Fixed

We added structured breadcrumbs:

{
  "@type": "BreadcrumbList",
  "@context": "https://schema.org",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://www.surmado.com"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Products",
      "item": "https://www.surmado.com/products"
    },
    {
      "@type": "ListItem",
      "position": 3,
      "name": "Signal",
      "item": "https://www.surmado.com/products/signal"
    }
  ]
}

The Impact

Before: SERP displayed full URL (www.surmado.com/products/signal)

After: SERP displayed breadcrumb trail (Home > Products > Signal)

Why this matters: Breadcrumbs in SERPs increase perceived trust and make your result visually distinct. Users understand your site hierarchy at a glance.

CTR increased from 2.1% → 2.4% on deep pages with breadcrumbs (14% relative lift).

Combined with LocalBusiness schema, overall CTR jumped from 2.1% → 3.2% (52% relative increase = 30%+ more clicks).

Why Lighthouse missed it: Lighthouse doesn’t test for breadcrumb schema. It only checks if your HTML navigation is accessible.


Error #3: Accessibility Issues Lighthouse Didn’t Catch

What Lighthouse checks: ~50 automated accessibility rules from axe-core

What Lighthouse missed: 100+ additional WCAG 2.1 AA checks that require manual testing or tools like pa11y

Our Problem

Lighthouse accessibility score: 100/100

Scan’s pa11y audit found 12 violations Lighthouse missed, including:

  1. Missing aria-label on icon-only buttons (WCAG 4.1.2)

    <!-- Lighthouse missed this -->
    <button><i class="icon-search"></i></button>
    
    <!-- Should be -->
    <button aria-label="Search"><i class="icon-search"></i></button>
  2. Insufficient color contrast on disabled form fields (WCAG 1.4.3)

    • Lighthouse only checks enabled form fields
    • Our disabled buttons had 3.5:1 contrast (minimum: 4.5:1)
  3. Missing focus indicators on custom dropdowns (WCAG 2.4.7)

    • We used outline: none on focused elements
    • Keyboard users couldn’t see where focus was (see the fix sketch after this list)
  4. Form labels not programmatically associated with inputs (WCAG 1.3.1)

    <!-- Lighthouse missed this -->
    <label>Email</label>
    <input type="email" />
    
    <!-- Should be -->
    <label for="email">Email</label>
    <input id="email" type="email" />
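
For issue #3, here's a minimal sketch of the focus-indicator fix, assuming a custom dropdown trigger styled as a button (the class name and colors are illustrative):

<style>
  /* Restore a visible outline for keyboard focus instead of outline: none */
  .dropdown-trigger:focus-visible {
    outline: 3px solid #1a73e8;
    outline-offset: 2px;
  }
</style>

<button class="dropdown-trigger" aria-haspopup="listbox" aria-expanded="false">
  Sort by
</button>

Using :focus-visible keeps the outline for keyboard users without drawing it on every mouse click.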

The Impact

SEO impact: None directly, but accessibility affects:

  • User experience (bounce rate decreased from 58% → 51% after fixes)
  • Legal compliance (WCAG 2.1 AA is required for many industries)
  • Usability (keyboard navigation now works properly)

Why Lighthouse missed it: Lighthouse runs ~50 automated checks. pa11y runs ~150+ checks. Many accessibility issues require human judgment (like color contrast in different contexts).


Error #4: Duplicate and Missing Meta Tags

What Lighthouse checks: Whether meta description exists (yes/no)

What Lighthouse missed: Duplicate meta descriptions across pages, missing Open Graph tags, and inconsistent meta tag strategy

Our Problem

Lighthouse said: “Document has a meta description”

Scan found:

  • 7 pages with duplicate meta descriptions (copy-pasted across product pages)
  • 14 pages missing Open Graph tags (no social sharing previews)
  • 3 pages with meta descriptions over 160 characters (truncated in SERPs)
  • Homepage missing Twitter Card tags

What We Fixed

  1. Unique meta descriptions for every page (tailored to page content; see the sketch after this list)

  2. Complete Open Graph tags for social sharing:

    <meta property="og:title" content="Surmado Signal: AI Visibility Testing" />
    <meta property="og:description" content="Test how ChatGPT, Claude, and Gemini discover your business. $50 persona-based AI visibility report in 15 minutes." />
    <meta property="og:image" content="https://www.surmado.com/og-signal.png" />
    <meta property="og:url" content="https://www.surmado.com/products/signal" />
  3. Twitter Card tags for better Twitter/X previews:

    <meta name="twitter:card" content="summary_large_image" />
    <meta name="twitter:title" content="Surmado Signal" />
    <meta name="twitter:description" content="..." />
    <meta name="twitter:image" content="https://www.surmado.com/twitter-signal.png" />

The Impact

Direct SEO impact: Duplicate meta descriptions dilute topical authority. Unique descriptions increased rankings for long-tail queries.

Social impact: Shares on Twitter/LinkedIn increased 3x (better previews = more clicks).

Why Lighthouse missed it: Lighthouse only checks if a meta description exists, not if it’s unique, optimal length, or complete.


Error #5: Broken Internal Links, Redirect Chains, and Orphaned Pages

What Lighthouse checks: Whether links have accessible names

What Lighthouse missed: Broken internal links, redirect chains, and orphaned pages with no inbound links

Our Problem

Lighthouse said: “Links have discernible names”

Scan’s crawl found:

  • 2 broken internal links (404 errors from outdated URLs)
  • 1 redirect chain (3-hop redirect from old blog URL)
  • 4 orphaned pages (no internal links pointing to them, not in sitemap)

What We Fixed

  1. Fixed broken links: Updated URLs to current pages
  2. Eliminated redirect chain: Updated link directly to final destination
  3. Added internal links to orphaned pages from relevant blog posts
  4. Submitted orphaned pages to sitemap.xml
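
For step 4, the sitemap entries themselves are just url elements in sitemap.xml. A minimal sketch (the orphaned-page URL and date are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Previously orphaned page, now listed so crawlers can find it -->
  <url>
    <loc>https://www.surmado.com/blog/ai-visibility-checklist</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>

A sitemap listing doesn't replace internal links (step 3); it just guarantees crawlers can discover the page at all.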

The Impact

Crawl budget: Redirect chains waste crawl budget. Fixing them helped Google discover new content faster.

Link equity: Orphaned pages had zero internal link authority. Adding links increased their rankings from page 3 → page 1 for long-tail queries.

User experience: Broken links hurt trust. Fixing them reduced bounce rate on affected pages.

Why Lighthouse missed it: Lighthouse audits one page at a time. It doesn’t crawl your entire site to find broken links, redirect chains, or orphaned pages.


What Lighthouse Is Good At (And What It’s Not)

Lighthouse Excels At:

  • Performance optimization (LCP, FID, CLS, Total Blocking Time)
  • Core Web Vitals (mobile performance, speed metrics)
  • Basic accessibility (~50 automated checks)
  • Progressive Web App features (service workers, manifest)
  • Best practices (HTTPS, image optimization, modern JS)

Lighthouse Is NOT Designed For:

  • Comprehensive SEO audits (schema validation, meta tag strategy, internal linking)
  • Schema.org strategy (validates syntax, not completeness or optimization)
  • Advanced accessibility (only checks ~1/3 of WCAG 2.1 AA rules)
  • Site-wide issues (doesn’t crawl, only audits single pages)
  • Competitive analysis (doesn’t show you what competitors are doing better)

The takeaway: Lighthouse is a fantastic performance tool. It’s not a comprehensive SEO tool.


The Real Cost of “Perfect” Lighthouse Scores

Our mistake: We assumed Lighthouse = comprehensive SEO audit.

The cost:

  • 3 months of missed CTR (30% fewer clicks than we should have gotten)
  • ~$5,000 in lost revenue from organic traffic we left on the table
  • Competitive disadvantage: Competitors with schema markup outranked us in rich snippets

The fix: a $25 Scan audit caught all 5 issues in 5 minutes.

ROI: $25 investment → 30% CTR increase → $5,000+ recovered revenue = 200x ROI.


How Scan Caught What Lighthouse Missed

Scan is designed for SEO, not just performance:

  1. Schema validation: Checks if schema exists AND if it’s strategically optimized (LocalBusiness, FAQPage, BreadcrumbList, etc.)
  2. Pa11y accessibility: 150+ WCAG 2.1 AA checks (vs Lighthouse’s 50)
  3. Site-wide crawl: Finds broken links, orphaned pages, duplicate content across your entire site
  4. Meta tag analysis: Flags duplicate descriptions, missing OG tags, over-length snippets
  5. Competitor comparison: Shows what schema your competitors use that you don’t

Lighthouse gives you a score. Scan gives you a prioritized action list.


What to Do Next

Step 1: Don’t Ignore Lighthouse

Keep using Lighthouse for performance optimization. It’s unmatched for Core Web Vitals.

But don’t assume a perfect Lighthouse score = comprehensive SEO.

Step 2: Run a Comprehensive Audit

Use Scan (or similar tools) to catch:

  • Schema markup gaps
  • Advanced accessibility issues
  • Site-wide linking problems
  • Meta tag inconsistencies
  • Content structure issues

Try Scan: $25 for basic audit

Step 3: Prioritize Schema Markup

The biggest wins from our audit came from:

  1. LocalBusiness schema (if you have a physical location or service area)
  2. BreadcrumbList schema (for deep pages)
  3. FAQPage schema (for high-value content pages; sketch below)

These three schemas alone drove our 30% CTR increase.
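
LocalBusiness and BreadcrumbList are shown above; for completeness, here's a minimal FAQPage sketch (the question and answer text are illustrative):

{
  "@type": "FAQPage",
  "@context": "https://schema.org",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How long does a Scan audit take?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A basic audit runs in about 5 minutes and costs $25."
      }
    }
  ]
}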

Step 4: Fix Accessibility Beyond Lighthouse

Use tools like:

  • pa11y (CLI tool for WCAG 2.1 AA checks)
  • axe DevTools (browser extension with more checks than Lighthouse)
  • Scan (includes pa11y + strategic recommendations)

The Bottom Line

Google Lighthouse is not lying to you. It’s just not designed for comprehensive SEO.

Our perfect 100/100 Lighthouse score masked 5 critical issues that cost us 30% of our organic traffic.

Lighthouse is a performance tool. SEO requires:

  • Strategic schema markup
  • Advanced accessibility testing
  • Site-wide crawling and analysis
  • Meta tag optimization
  • Competitive intelligence

One $25 Scan audit caught everything Lighthouse missed and delivered a 200x ROI.

Your Lighthouse score might be perfect. Your SEO probably isn’t.


Ready to find what Lighthouse missed? Run a Scan audit ($25) and get a prioritized action list in 5 minutes. No perfect scores. Just the issues that actually matter.