Making Digital Accessibility Work In The AI Era



Dr. Tuboise Floyd hosts Dr. Michele A. Williams to explain why the digital accessibility failures present across 97 percent of the web create equity, resilience, and trust risks that AI can magnify at scale. Williams contrasts the medical and social models of disability, addresses ableism and language (person-first vs. identity-first), and argues that checklists cannot replace lived experience or disabled participation in UX research and leadership. They discuss how inaccessible code, tools, and AI trained on inaccessible data produce issues like missing labels, broken keyboard paths, and poor semantic structure, and warn against "disability dongles" that add technology instead of removing systemic barriers. Dr. Williams outlines a practical 90-day plan: establish a baseline with scans and process mapping.

GUEST
Dr. Michele A. Williams
UX and Accessibility Consultant, Making Accessibility Work
Author of Accessible UX Research (Smashing Media)
https://mawconsultingllc.com
https://www.linkedin.com/in/micheleawilliams1

Accessible UX Research
Publisher: Smashing Magazine
https://www.smashingmagazine.com/2025/06/accessible-ux-research-pre-release/

YouTube: https://youtu.be/pxXLNsbyJhc?si=Dt9mf2HK4AtyCx6_

CHAPTERS
00:00 Accessibility Wake Up Call
00:57 Meet Dr Michele Williams
02:07 Equity Resilience Trust
04:01 Disability Mindset Shift
05:59 Why Lived Experience Matters
07:14 Person First vs Identity First
13:01 AI Promise and Harm
15:23 Social Model In Practice
19:58 Beyond Screen Readers
25:02 Exclusion Inside Real Teams
26:58 Semantic Code Chaos
28:32 Standards Lag Tech
29:12 Siri Zoom Panic
31:23 Disability Dongles
33:36 AI Hype Reality
37:25 Beyond Checklists
40:32 90 Day Baseline
42:30 Change Defaults
44:17 Normalize Inclusion
46:47 Nothing About Us
49:13 One Action This Week
50:35 Closing Credits

SUBSCRIBE & SUPPORT
Subscribe now to lock in the feed. This isn't just content; it's a continuing briefing for the Builder Class.
Support Human Signal: Help fuel six months of new episodes, visual briefs, and honest playbooks.
🔗 https://humansignal.io/support
Every contribution sustains the signal.

ABOUT THE HOST
Dr. Tuboise Floyd is the founder of Human Signal, a strategy lab and podcast for people deploying AI inside government agencies, universities, and enterprise systems. A PhD social scientist and former federal contracting strategist, he reverse-engineers system failures and designs AI governance controls that survive real humans, real incentives, and real pressure.

PRODUCTION NOTES
Host & Producer: Dr. Tuboise Floyd
Creative Director: Jeremy Jarvis
Tech Specs: Recorded with true analog warmth. No artificial polish, no algorithmic smoothing. Just pure signal and real presence for leaders who value authentic sound.

CONNECT
LinkedIn: linkedin.com/in/tuboise
Email: tuboise@humansignal.io
https://humansignal.io/

TRANSCRIPT
Full transcript available upon request at hello@humansignal.io

TAGS/KEYWORDS
AI Governance, Risk Management, Innovation, Project Cerebellum, CIO Leadership, AI Ethics, Military Technology, Cybersecurity, AI Policy, Enterprise AI, Government AI, Technology Leadership

HASHTAGS
#AIGovernance #RiskManagement #Innovation #AIPolicy #CIOLeadership #AIEthics #HumanSignal #MilitaryTech #Cybersecurity #ProjectCerebellum

LEGAL
© 2026 Dr. Tuboise Floyd. All rights reserved. Content is part of the Presence Signaling Architecture® (PSA), GASP™ and L.E.A.C. Protocol™.

COMPANIES MENTIONED IN THIS EPISODE
Smashing Magazine
Accessibe

TAKEAWAYS
A staggering 97% of the web remains rife with accessibility barriers that hinder disabled individuals.
Accessibility is not merely a compliance issue but a vital consideration that shapes user experience and organizational culture.
To create truly inclusive products, it is essential to incorporate the perspectives and experiences of disabled individuals throughout the design process.
Artificial intelligence must not be viewed as the sole solution for accessibility; it should be integrated thoughtfully with human expertise and oversight.

This podcast uses the following third-party services for analysis:
OP3 - https://op3.dev/privacy