Yesterday, Jessica Wykes (City’s Accessibility Librarian) and I attended the opening event for a project called “Accessible Digital Futures,” a joint effort between Jisc’s National Centre for AI in Tertiary Education and the Glenlead Centre, led by Dr. Ann Kristin Glenster. This two-year initiative aims to significantly boost the responsible adoption of artificial intelligence within the tertiary education sector. The focus areas are comprehensive, spanning policy, procurement, adoption, and industrial development, with a strong emphasis on the “responsible” integration of AI technologies.
The project outlines a progression through five stages of AI maturity: from initial understanding and experimentation, through operational and embedded use, to ultimately optimised transformation. Most Higher Education Institutions (HEIs) appear to sit between stages two and three, experimenting with AI and beginning to operationalise these technologies.
A substantial portion of the workshop was dedicated to identifying gaps in our collective knowledge, particularly regarding the accessibility of AI. Through discussions in small groups, we delved into what constitutes safe, responsible, and ethical AI, how HEIs can effectively evaluate and adopt accessible AI solutions, what types of products should be developed, and the barriers to creating accessible AI.
Several critical issues were brought to light, notably the potential for AI to introduce new accessibility problems as it becomes embedded in everyday technology, and an over-reliance on AI to solve accessibility issues, which may inadvertently shift responsibility onto students. This leads to a pressing question that I find particularly interesting: whose needs, preferences, and aspirations should guide the development of accessible AI solutions, and who is involved in prioritising these issues? Currently, solution development is often driven by the technical skills developers have and the profit potential they perceive, with minimal engagement from the disabled community. The approach is predominantly expert-driven, with limited direct involvement from users with disabilities.
Prof. Steve Watson, one of the key speakers, emphasised the crucial need for ongoing participatory co-design of learning environments. This approach ensures that development aligns with core inclusion principles of individualisation and adaptation, for which a deep understanding of the real needs of all users is necessary.
This discussion reminded me of a recent webinar by UsableNet on AI and Web Accessibility, which offered a perspective from a blind screen-reader user. The webinar highlighted specific AI technologies that are truly beneficial for navigating both physical and digital spaces but also pointed out the limitations of automated AI solutions in fully addressing accessibility on the web.
Among the current AI solutions, voice and image recognition technologies stand out as real game-changers, offering detailed descriptions of objects and surroundings and measuring distances. However, AI’s ability to interpret contextual, dynamic, and complex elements remains limited. This underscores the importance of involving assistive technology (AT) users in the development and testing of AI solutions, ensuring these technologies truly meet their needs.
In the realm of accessibility testing, AI improves efficiency but does not significantly extend the scope of detectable errors beyond what automated testing already identifies. For instance, while AI can detect images without descriptions and suggest alt text of variable accuracy, it cannot address more complex accessibility issues, such as inconsistent navigation, which may prevent users from effectively accessing a website.
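This gap can be illustrated with a small sketch: a missing-alt-text check is trivial to automate, whereas judging whether a site's navigation is consistent is not. The Python snippet below is purely illustrative (it is not any vendor's actual testing tool) and flags images without alt text; note that it treats an empty `alt=""` as missing, which is a simplification, since empty alt text is valid for purely decorative images.

```python
# Illustrative sketch of the kind of check automated accessibility
# testing performs: flagging <img> elements with no alt text.
# Uses only the Python standard library.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images flagged for review

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # Simplification: empty alt text is treated as missing,
            # although alt="" is legitimate for decorative images.
            alt = (attr_map.get("alt") or "").strip()
            if not alt:
                self.missing.append(attr_map.get("src", "<no src>"))


html = '<img src="logo.png" alt="University logo"><img src="chart.png">'
checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # → ['chart.png']
```

A check like this can be run across thousands of pages in minutes, but deciding whether the suggested alt text is meaningful, or whether navigation behaves consistently across those pages, still requires human judgement, ideally from assistive technology users themselves.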
In summary, while AI tools offer potential benefits in addressing certain accessibility challenges, the webinar reinforced the idea that direct involvement of people with disabilities in technology decisions, including the provision of edtech solutions, is paramount for achieving true accessibility and ensuring our work is both inclusive and impactful.
—
Dr Sylwia Frankowska-Takhari, Digital Accessibility, LEaD
Keep up to date on this topic:
Jisc’s Accessible Digital Futures: Growth of AI in and as assistive technology
Join Jisc’s AI community
Find out more about how City staff and students are engaging with AI