
Why a 100/100 Accessibility Score Still Lets Users Get Trapped

EqualAudit
2 min read

Your website scored 100/100 on an automated accessibility scanner, yet a keyboard-only user still can't submit your contact form. Automated tools catch roughly 30% of real WCAG violations—the rest require human-led, screen-reader testing to uncover.


How?

Automated tools are great for catching missing alt-text or contrast errors, but they only catch about 30% of actual WCAG violations. They read the code; they don't experience the site.

Here's a real-world "invisible trap" I see constantly:

A beautifully styled, custom dropdown menu for selecting an "Industry" on a lead form. The automated scanner sees valid <div> markup, finds nothing to flag, and gives it a pass.

But when a user with a motor impairment tries to navigate the form using only their keyboard, they tab to the dropdown, hit Enter, and... nothing happens.

The developer forgot to add a keyboard event listener for the Enter key. The user is trapped, assumes the form is broken, and leaves.
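The fix can be sketched as pure state logic. This is a minimal, hypothetical example (no specific framework assumed; the function and state names are illustrative): the dropdown responds to the same keys a native <select> honours, and a keydown listener wires that logic to the <div>.

```javascript
// Hypothetical sketch of the missing keyboard support for a <div>-based
// dropdown. State shape: { open, options, activeIndex, selected }.
function dropdownKeyHandler(state, key) {
  switch (key) {
    case "Enter":
    case " ":
      // Closed: Enter opens the list. Open: Enter commits the choice.
      if (!state.open) return { ...state, open: true };
      return { ...state, open: false, selected: state.options[state.activeIndex] };
    case "ArrowDown":
      return { ...state, activeIndex: Math.min(state.activeIndex + 1, state.options.length - 1) };
    case "ArrowUp":
      return { ...state, activeIndex: Math.max(state.activeIndex - 1, 0) };
    case "Escape":
      return { ...state, open: false }; // never trap the user
    default:
      return state;
  }
}
```

In the real component this would be attached with `element.addEventListener("keydown", ...)`, alongside the ARIA roles (`listbox`, `option`, `aria-expanded`) that tell assistive technology what the <div> actually is.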

Lost lead. Lost revenue. AODA violation.

This is the critical gap between automated and manual accessibility testing. Here's a breakdown of what each approach catches:

Automated scanners find:

  • Missing or empty alt attributes
  • Insufficient colour contrast ratios
  • Missing form labels (when the label is absent outright)
  • Missing page language attributes

Manual + assistive technology testing finds:

  • Focus order that makes logical sense
  • Custom interactive components that don't respond to keyboard events
  • Screen reader announcements that are confusing or incorrect
  • Timing-dependent interactions that fail for slow users
  • Cognitive load issues that no algorithm can measure

The WCAG 2.2 Success Criterion 4.1.3 (Status Messages) is a perfect example of a requirement that automated tools almost universally miss. It requires that status messages—like "Your form was submitted successfully"—be announced to screen readers without receiving focus. An automated tool may not test whether your toast notification is wired up with the correct ARIA live region.
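That wiring can be sketched in a few lines. The function names here are illustrative, not a library API; the substance is the ARIA attributes and the fact that the region exists before any message is inserted, since screen readers only monitor live regions that were already in the DOM.

```javascript
// Create the live region once, at page load — not at announcement time.
function createLiveRegion(doc) {
  const region = doc.createElement("div");
  region.setAttribute("role", "status");      // role=status implies polite announcements
  region.setAttribute("aria-live", "polite"); // explicit for older assistive tech
  doc.body.appendChild(region);
  return region;
}

// Announce by swapping the region's text; focus never moves, which is
// exactly what SC 4.1.3 (Status Messages) requires.
function announce(region, message) {
  region.textContent = ""; // clear first so a repeated message re-announces
  region.textContent = message;
}
```

A toast component would call `announce(region, "Your form was submitted successfully")` when it renders the visual notification, keeping the visible and spoken messages in sync.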

This is exactly why EqualAudit doesn't rely solely on automated scanners. We manually navigate your site using screen readers (NVDA/VoiceOver) and keyboard-only constraints to find the barriers that bots miss.

When was the last time you tried to navigate your company's checkout or contact form without using your mouse?

Give it a try today. If you get stuck, let's talk.
