In celebration of Global Accessibility Awareness Day (GAAD), held on 16 May this year, Sylwia Frankowska-Takhari and I were invited to participate in the second session of a series of workshops that are part of a project called ‘Accessible Digital Futures’, a joint effort between Jisc’s National Centre for AI in Tertiary Education and the Glenlead Centre, led by Dr Ann Kristin Glenster. The theme for this session was ‘Growth of AI in and as assistive technology’.
The first activity of the event was a discussion in small groups, centred on the use of AI in HE and how it is influenced by the intersection of developers, procurement, budget, policy makers, practitioners, and students.
We explored the implications of AI for accessibility, and the potential difficulties that may arise for students with disabilities if we approach AI with a hardline, no-tolerance policy. Whilst, understandably, teachers may be concerned about students presenting work that is not entirely their own, banning the use of AI outright undermines the value it may offer to people with disabilities. For example, students with dyslexia may find the AI tool Grammarly particularly helpful for structuring their thoughts and ideas. See the quote below, taken from the Jisc report titled ‘AI: Empowering Inclusive Education’:
“It structures my thoughts and helps me produce flowing written work, keeping me on point and preventing personal tangents. I strongly believe that ethical use of AI is crucial for creating equitable environments and levelling the playing field for neurodiverse learners.” (Nalina Brahim-Said, Student)
We also discussed the absence of clear accountability when it comes to accessibility-related issues, and how it is a common struggle across HEIs to identify next steps when a tool doesn’t meet disabled users’ needs. Reliance on open-source tools may do more harm than good for individuals with disabilities, as these tools may not always prioritise accessibility features, making it difficult for people with disabilities to use them effectively. Procurement teams may argue that it is not cost-efficient to invest in assistive technology tools if only one or two people will ultimately use them. However, we argue that accessibility should not be a numbers game; instead, we should focus on the impact. If a tool is invaluable to even one student with a disability, then this is reason enough to invest in it.
Ultimately, our group agreed that there is a need for professional bodies to establish clearer policies regarding AI tools that all HEIs abide by, but these policies should carefully consider the fact that AI can be an enabler for accessibility.
The second activity was another group discussion, this time using the lenses of theorist, practitioner, policy and product to determine how we might invoke empathy, curiosity and compassion when it comes to AI use in HEIs.
A particularly interesting point raised by one participant was that there seems to be a disconnect between the HEI view of AI and the real-world corporate view. Students are constantly being warned against using AI for their university work, yet in the workplace we see companies striving to incorporate AI into as many processes as possible. Surely we should be preparing students for the world of work now. So how do we empower students to use AI in an ethical, responsible way, and how might we establish trust? Perhaps one way might be to introduce AI discussions and workshops on each course to encourage dialogue between teachers and students and establish what is and isn’t acceptable. Embedding AI within class activities might not only pique students’ curiosity but also provide them with clearer boundaries. In these workshops, students with disabilities may discover new technologies to assist them, or share their experiences with AI to improve understanding amongst peers and staff.
In summary, AI itself is not inherently good or bad. Whilst concerns about originality are warranted, there is no denying that AI may help students with disabilities. The absence of clear policies and accountability when it comes to accessible AI tools may hinder the successful implementation of AI within an HE context. Encouraging more open dialogue between staff and students about AI may serve as a good first step in empowering students with disabilities.
—
Victoria Brew-Riverson, Digital Accessibility, LEaD
Want to keep up to date on this topic?
Jisc’s Accessible Digital Futures opening event
Join Jisc’s AI community
Find out more about how City staff and students are engaging with AI