UX Research · 2025 · UX 60504 - Accessibility and Universal Design · Kent State MS UX

Accessibility Audit Suite:
Music Tech Meets WCAG

WCAG 2.1 AA/AAA · Contrast Analysis · VoiceOver · PDF Remediation · Social Media Audit · Cognitive Accessibility

The Problem

Music technology is a niche with a trust problem. The community is passionate, technically deep, and globally distributed - but the companies building the tools are often small teams shipping fast. Dedicated accessibility resources are rare. Formal a11y training is rarer still. The result is a category of software and web experiences that routinely fails users who rely on assistive technology, have low vision, or process information differently.

That's the landscape I audited across five methods during this course. The goal wasn't to produce a scorecard. It was to develop a professional audit practice - one I could carry into real design work.

My Approach

Rather than treating each assignment as a standalone deliverable, I structured the work as a multi-method audit suite across different artifact types: social media content, commercial websites, an interactive learning platform, a PDF document, and direct assistive technology testing. That variety was intentional. Accessibility failures don't cluster in one place. Neither should the audit.

  • Social media content audit - Instagram (Kent State College of Aeronautics)
  • Contrast ratio and color blindness analysis - Three music tech websites via WebAIM Contrast Checker
  • Assistive technology hands-on - VoiceOver testing on GroundNews web
  • PDF remediation - Adobe Acrobat accessibility checker on a scanned document
  • Cognitive accessibility research - Original reflection on AI, cognitive load, and machine readability

What I Found

Instagram: Accessibility as Afterthought

The Kent State College of Aeronautics feed had alt text on images, but all of it was auto-generated rather than human-authored, and none of it was contextually accurate. Auto-generated alt text that misidentifies or vaguely describes an image is a different failure from missing alt text: it creates false confidence. A screen reader user gets a description, but the description is wrong. Decorative emoji were scattered throughout captions with no aria-hidden treatment, so screen readers announced each one by name. Story text overlays failed contrast requirements. Video content had no captions. The alt text problem in isolation is fixable with workflow changes. Combined with the rest, it signals that accessibility is being handled by automation rather than intention, which is its own category of problem.
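
Instagram doesn't give content authors this kind of markup control, but for web content you do control, the emoji fix is mechanical. A minimal TypeScript sketch, assuming captions are rendered as HTML; the function name and emoji regex are illustrative heuristics of mine, not a platform API:

```typescript
// A minimal sketch: wrap runs of emoji in aria-hidden spans so screen
// readers skip them instead of announcing each one ("airplane", "fire")
// mid-sentence. The regex is a rough heuristic for emoji sequences.
const EMOJI_RUN = /[\p{Extended_Pictographic}\u{FE0F}\u{200D}]+/gu;

function hideDecorativeEmoji(captionHtml: string): string {
  // $& re-inserts the matched emoji run inside the hidden span
  return captionHtml.replace(EMOJI_RUN, '<span aria-hidden="true">$&</span>');
}

hideDecorativeEmoji("Cleared for takeoff ✈️ see you at the hangar 🔥🔥");
// → 'Cleared for takeoff <span aria-hidden="true">✈️</span> see you at
//    the hangar <span aria-hidden="true">🔥🔥</span>'
```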

Contrast Audit: Specific Failures, Not Generic Ones

The audit covered three music tech sites: Ableton.com, Fors.fm, and Audiothingies.com. Fors.fm was a surprise. I expected to find violations there and didn't. Ableton largely passed. The significant failures were concentrated at Audiothingies.com: gold-colored link text at a 2.4:1 contrast ratio (well below the 4.5:1 WCAG AA minimum under SC 1.4.3), red heading text at 4.3:1 that just misses the threshold, an "Out of Stock" notification and a CTA button that both fail contrast requirements, and small red header text (23px) that doesn't qualify for the large-text exception.
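
The ratios above come from the WCAG 2.1 relative luminance formula, which is what WebAIM's checker computes under the hood. A TypeScript sketch of the math behind SC 1.4.3; the gold hex value is illustrative rather than the site's actual palette, though it happens to land near the 2.4:1 the audit found:

```typescript
// Sketch of the WCAG 2.1 contrast math (SC 1.4.3) implemented by tools
// like WebAIM's checker. Hex values are illustrative.
function relativeLuminance(hex: string): number {
  // Parse "#rrggbb" and linearize each sRGB channel per the WCAG definition
  const [r, g, b] = [0, 2, 4].map((i) => {
    const c = parseInt(hex.slice(i + 1, i + 3), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// AA: 4.5:1 for normal text, 3:1 for large text (≥24px, or ≥18.66px bold)
contrastRatio("#c9a227", "#ffffff"); // ≈ 2.4 - fails both thresholds
```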

The takeaway: failures cluster in specific interactive elements and status indicators, not uniformly across a site. That makes them easy to miss in a casual review, and consequential for users with low vision who depend on those exact touchpoints.

VoiceOver: The Baseline Was Missing

VoiceOver testing on GroundNews revealed navigation order issues that made the reading experience non-linear in confusing ways - the focus sequence didn't match the visual hierarchy. Interactive elements were unlabeled. Landmark regions were missing, which meant screen reader users had no efficient way to jump between sections of the page. These aren't edge-case issues. They're the baseline.
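
Missing landmarks are also among the easiest failures to check for. A rough devtools sketch in TypeScript, pairing each HTML5 landmark element with its ARIA role equivalent; the selector list is my own, trimmed to the common landmarks:

```typescript
// A rough console check for the landmark regions screen reader users
// rely on to jump between page sections. Run in browser devtools.
const LANDMARKS: Record<string, string> = {
  banner: "header, [role='banner']",
  navigation: "nav, [role='navigation']",
  main: "main, [role='main']",
  complementary: "aside, [role='complementary']",
  contentinfo: "footer, [role='contentinfo']",
  search: "[role='search']",
};

for (const [landmark, selector] of Object.entries(LANDMARKS)) {
  const count = document.querySelectorAll(selector).length;
  console.log(`${landmark}: ${count || "MISSING"}`);
}
```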

PDF Remediation: Before/After Delta

Starting from a scanned document, I used Adobe Acrobat's accessibility checker to identify reading order failures, missing alt text on figures, and untagged tables. The remediation process required manually setting reading order, writing descriptive alt text, and tagging table structure. Documenting the before/after delta made the scope of the problem concrete in a way that a checklist alone doesn't.

Beyond WCAG: Cognitive Accessibility

WCAG organizes accessibility around four principles: perceivable, operable, understandable, and robust. That framework catches a lot. But it doesn't fully address cognitive load, focus support, or plain language as first-class concerns. To fill that gap, I built an evaluation methodology grounded in the W3C's COGA (Cognitive and Learning Disabilities Accessibility) task force work and their "Making Content Usable for People with Cognitive and Learning Disabilities" guidance. The goal was to extend the audit to cover the experience of users with ADHD, dyslexia, autism, memory impairments, and mental health conditions like communication anxiety.

I developed eight evaluation criteria that sit alongside traditional WCAG checks: plain language, clear iconography, consistent navigation, focus support, memory independence, error prevention, content scannability, and progressive disclosure. I also ran Flesch-Kincaid readability analysis and applied mobile care heuristics for cognitive accessibility. These criteria surface a class of failures that automated tools and contrast checkers will never flag: unclear error messages, inconsistent interaction patterns, dark patterns that exploit decision fatigue, and content that demands too much working memory to parse.
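
For reference, the Flesch-Kincaid grade-level formula is simple enough to sketch directly. The syllable counter below is a rough vowel-group heuristic of my own, fine for comparing drafts but not dictionary-accurate:

```typescript
// Sketch of the Flesch-Kincaid grade-level formula used in the
// readability pass. Syllable counting is an approximation.
function countSyllables(word: string): number {
  const w = word.toLowerCase().replace(/[^a-z]/g, "");
  if (w.length <= 3) return 1;
  const groups = w
    .replace(/e$/, "")      // drop a silent trailing 'e'
    .match(/[aeiouy]+/g);   // runs of vowels approximate syllables
  return Math.max(1, groups?.length ?? 1);
}

function fleschKincaidGrade(text: string): number {
  const sentences = Math.max(1, (text.match(/[.!?]+/g) ?? []).length);
  const words = text.split(/\s+/).filter(Boolean);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  // grade = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
  return 0.39 * (words.length / sentences) + 11.8 * (syllables / words.length) - 15.59;
}
```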

This is the part of accessibility work that most teams skip entirely. WCAG compliance is necessary but not sufficient. If your content is technically perceivable and operable but still confusing, you haven't solved the problem.

The Insight That Changed How I Think

The cognitive accessibility rabbit hole is where this work got interesting for me.

The research question I started with was straightforward: how do AI tools intersect with cognitive accessibility? What I landed on was something I hadn't seen stated plainly anywhere: accessibility best practices don't just help people - they help machines.

Clean semantic structure, descriptive alt text, logical reading order, properly tagged PDFs - all of it makes content more parseable by LLMs, search crawlers, and AI summarization tools. The two goals are the same goal.

I've been using AI tools to manage the cognitive load of a demanding grad school schedule. Not to do the work - to organize my thinking and reflect on what concepts mean. There's a real difference. But the experience made me aware of how inaccessible content creates friction not just for screen reader users, but for anyone or anything trying to extract meaning from a poorly structured document or page.

Universal design isn't a compliance exercise. It's a design posture: assume the full range of human variation from the start, and build accordingly. When I think about the AI tools I'm building and intend to build, that posture goes with me.

What a Professional Version Would Include

A class assignment has a clear scope boundary. A real audit engagement doesn't. Here's what the professional version of this work would add:

  • Stakeholder interviews before the audit - understanding the team's current workflow, toolchain, and capacity for remediation changes what you prioritize
  • User testing with assistive technology users - heuristic evaluation and automated checkers find a lot, but they don't find everything. Nothing replaces watching a screen reader user navigate in real time
  • Remediation tracking - a findings report with no tracking mechanism has limited organizational impact. A prioritized issue log with severity ratings, WCAG criteria references, and remediation owners is the real deliverable (sketched after this list)
  • VPAT documentation - for enterprise software or any product sold to institutions, a Voluntary Product Accessibility Template is often required
  • Longitudinal re-audit - accessibility degrades as content is added and codebases change. A one-time audit is a starting point, not a solution
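
A minimal sketch of what one issue-log entry might look like as a data structure, using a contrast finding from this audit as the example. Field names and severity buckets are my own assumptions, not a standard schema; a real engagement might map these onto a VPAT:

```typescript
// Sketch of the issue-log entry described above. Illustrative, not a standard.
type Severity = "critical" | "serious" | "moderate" | "minor";

interface AuditFinding {
  id: string;              // e.g. "AUD-001"
  wcagCriterion: string;   // e.g. "1.4.3 Contrast (Minimum)"
  severity: Severity;
  location: string;        // URL or component where the issue appears
  description: string;
  remediation: string;     // the concrete fix, not just the failure
  owner: string | null;    // unassigned findings surface in triage
  verified: boolean;       // flipped only after re-testing with AT
}

const finding: AuditFinding = {
  id: "AUD-001",
  wcagCriterion: "1.4.3 Contrast (Minimum)",
  severity: "serious",
  location: "Product page - link text",
  description: "Gold link text at ~2.4:1 against its background.",
  remediation: "Darken the link color until it meets 4.5:1 for normal text.",
  owner: null,
  verified: false,
};
```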

Skills Demonstrated

  • WCAG 2.1 AA/AAA application and citation
  • Contrast ratio analysis (WebAIM)
  • Color blindness simulation and evaluation
  • VoiceOver / screen reader testing
  • PDF remediation (Adobe Acrobat)
  • Social media content accessibility audit
  • Cognitive accessibility and AI intersection
  • Technical writing for accessibility findings
  • Music technology domain expertise

Original Submission

View original PDF →