AI in STEM Education

AI means “any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals” (1). The discussion often centres on the mechanisms used in the perceiving, the maximising algorithms, or the data sources which describe our environment. The wider question of what the goals of the system are, and by extension the goals of the people who design and control such systems, is often left unexplored. We are told that “AI can drive efficiency, personalization and streamline admin tasks to allow teachers the time and freedom to provide understanding and adaptability” (2) without any qualification as to whether driving efficiency in education is actually desirable, whether it should be a primary focus, or what “providing understanding” actually looks like.

Statements such as the one above have been used to hype many technologies and ideas of the past: computer-based assessment, iPads, eLearning, video, MOOCs, social media. Whilst some of these technologies have had a significant impact on what education looks like (PowerPoint and lecture capture, for example), teachers and lecturers are in fact more overworked, more stressed and as undervalued as ever. “Policy changes, budget cuts, fewer staff and bigger classes blamed for toll on teachers’ mental health revealed by figures compiled by Lib Dems”, says a Guardian article from 2018 (3). The devotional focus and astronomical funding that new technologies receive take attention and money away from conversations about workplace stress, mental health, increasing class sizes and the continuing commercialisation of education. Instead of focusing on policy, people, quality and care, the conversations are about efficiency and scale.

Just as we saw in the stories about racist chatbots and algorithms in recent years (4), technology and AI are a reflection and formalisation of the existing understanding, models and structures we use to explain and control our environment. “Today’s AI technologies are powerful but unreliable. Rules-based systems cannot deal with circumstances their programmers did not anticipate. Learning systems are limited by the data on which they were trained. AI failures have already led to tragedy. Advanced autopilot features in cars, although they perform well in some circumstances, have driven cars without warning into trucks, concrete barriers, and parked cars. In the wrong situation, AI systems go from supersmart to superdumb in an instant.” (5) In an educational institution the mistakes might not be as catastrophic, but the effects could be more subtle and, in the end, just as damaging.

There are plenty of hyperbolic statements about AI in education. A Microsoft blog post says that “familiar Office applications are now ‘super charged’ by the power of the intelligent cloud, utilising Machine Learning (ML) to infuse AI driven features into the products and help teachers improve learning outcomes” (6). Exactly what that means and how it is meant to work is not clear to me. What is clear, however, is how much of an effect AI research is having on assistive technologies such as automatic translation, automatic captioning and image recognition. Problems requiring great computational power, or problems which rely on statistical modelling, are seeing real progress. This in turn can provide access to education for students who were unable to access it before. That, to me, is the greatest promise of such systems: letting us care for more and more students. The Learning Analytics hype is also hopeful in the sense that it will force institutions to directly address “students at risk”, since the data will be available and plain to see, and ignoring such indicators would be problematic.

We must remember that technology is just a tool. It can enable us to reach students in new ways, enable students to access learning they could not before, break down barriers and more. However, we are the ones wielding the tools, and it is our intentions and goals which will ultimately shape education. AI might very well make educational institutions more “productive” or “efficient” if that is how we choose to deploy it, but it might also make them more open and more welcoming.


References:

(1) Poole, David; Mackworth, Alan; Goebel, Randy (1998). Computational Intelligence: A Logical Approach. New York: Oxford University Press. ISBN 978-0-19-510270-3

(2) https://www.forbes.com/sites/bernardmarr/2018/07/25/how-is-ai-used-in-education-real-world-examples-of-today-and-a-peek-into-the-future

(3) https://www.theguardian.com/education/2018/jan/11/epidemic-of-stress-blamed-for-3750-teachers-on-longterm-sick-leave

(4) https://www.theguardian.com/technology/2016/mar/30/microsoft-racist-sexist-chatbot-twitter-drugs

(5) Scharre, Paul, “Killer Apps: The Real Dangers of an AI Arms Race”, Foreign Affairs, vol. 98, no. 3 (May/June 2019), pp. 135–44.

(6) https://educationblog.microsoft.com/en-us/2018/03/artificial-intelligence-in-the-classroom/
