Smart Schools: AI Technology and Education Futures as Imagined on Screen

by Amy C. Chambers and R. Lyle Skains


Since long before we had the technology to make it a reality, humans have been fascinated by the concept of artificial intelligence (AI): a completely human-made being that can learn and make decisions independently. We have wondered how such beings would look, sound, and act. Most often, and in true self-centric form, we have wondered how they would feel about us. Less frequently do we wonder how our speculations reflect society’s perceptions of the humans whose roles AIs might assume. One of those roles is education and pedagogy: teachers. The COVID-19 pandemic and the rapid societal changes it has wrought have once again exposed troubling attitudes towards and treatment of teachers, and provide an excellent opportunity to compare their role to that of advancing technology.

On screen, artificial intelligence is most frequently presented as analogous to the Creator-Human relationship: humans creating intelligence in our image. Fiction explores how we might come to use such sentient machines, usually shaping them into human forms with human skills and thought patterns, as synthezoids with mechanical bodies and AI minds. Frequently, these stories ask us: where is the line between human and machine? When does a machine, like Pinocchio, become a “real” human? Many works explore this possibility, including Lost in Space (1965-68; 1998 [film]; 2018- ), Star Trek: The Next Generation (1987-94), and A.I. Artificial Intelligence (Spielberg 2001). They may also touch upon, as Battlestar Galactica (2004-09), Humans (2015-18), the Terminator franchise, Ex Machina (Garland 2014), and Blade Runner (Scott 1982) do, the seemingly inevitable consequences of ignoring AI beings’ Pinocchio moments, so that they become Frankenstein’s monster, wreaking havoc on their own creators. These representations reflect both our utopic hopes for AI (that we will be able to enhance and improve our lives through its employment) and our dystopic fears (that our created slaves will rise up against us). Rarely are intelligent machines presented realistically: as computer programs that analyse images, monitor environments, digitise printed text, process natural language, and model economic scenarios, to name a few. Information services, such as encyclopaedias, language translation, and, yes, education and pedagogy, are among the few more down-to-earth depictions of AI systems in media.

A.I. Artificial Intelligence (Spielberg, 2001)

The pandemic has literally brought home a better understanding of what current-day, non-imaginary teachers do on a daily basis. This is especially true at the primary/junior school level, where (hopefully) we will be able to retire the idea that skilled childcare educators are “glorified babysitters”, along with the idea that essential domestic and retail workers do not deserve to be paid a living wage (Gibson 2017). Alongside the belittling “babysitter” image, teachers’ roles are often compared to machines: interfaces delivering content for a standardised curriculum as part of a national service. When the impact of the COVID-19 pandemic became increasingly apparent, teachers were asked to flip from in-person to online learning, with expectations of managing distance learning and offering tech support to pupils and their guardians. The blurred boundaries of the expectations placed on teachers were made visible, and the emotional burden placed on these individuals was justified by the notion that teaching is a calling rather than a career.

One of the more influential representations of AI/smart-tech futures is almost 60 years old: The Jetsons (1962-63). This animated TV sitcom introduces Rosie, the mature robot house helper whose tasks include after-school educational support (she has encyclopaedic knowledge but is used for domestic work); the show’s imagined education future at Little Dipper School gives us the subtly named Ms. Brainmocker. Brainmocker is a shrill, matronly, female-presenting robot who barks instructions and harshly assesses the children with retrofuturistic report cassettes (Novak 2013). She is a dystopian vision of the future of education, replicating the satirical suggestion that teachers are either angry disciplinarians of the Mrs Trunchbull variety in Matilda (Dahl 1988/DeVito 1996), for whom children are an inconvenience, or kindly, mother-like Miss Honey figures who love their fragile charges. Class of 1999 (Lester 1990) and The Umbrella Academy (2019- ) offer two more iterations of AI teachers reflecting this disciplinarian/mother dichotomy, each in fully realised synthetic human form. In Class of 1999, a perhaps under-appreciated commentary on the role of teachers in underserved schools, a high school principal brings in three android former soldiers to control a student population spiralling into gang- and drug-related violence; these “teachers” utterly fail to teach, and fail even in their disciplinarian roles, instead visiting violence and death upon their charges. Conversely, in adopting (conscripting?) seven children with super-hero capabilities into his Umbrella Academy, billionaire Sir Reginald Hargreeves (Colm Feore) creates an android mother (Jordan Claire Robbins) for them; her sole purpose is to meet the children’s emotional and non-powers-related pedagogical needs.

In a more recent example from the Star Trek universe, Discovery (2017- ) presents formative flashbacks to protagonist Michael Burnham’s (Sonequa Martin-Green) childhood as a human growing up in the Vulcan Learning Centre. These futuristic visions of Vulcan education separate pupils into subterranean learning pits, providing one-to-one (and exclusively AI-delivered) education to each student through voiced holograms. As Sarek’s (James Frain) ward and Spock’s (Ethan Peck) adopted sister, Michael is the first human to attend the Centre; she is traumatised by the standardised curriculum that forces her to answer questions about the Klingons who allegedly killed her parents. While the utopic ideal of AI poses these replacement teachers as the ultimate in tailored education, designed around each individual student, this thoughtful representation points out a potential pitfall: the lack of human agency and emotional connection can result in a detrimental pedagogic relationship rather than an ideal one.

The Matrix (Lana Wachowski, Lilly Wachowski, 1999)

Agency and emotional connection are, of course, crucial to education. More than information regurgitation and retention, good education also includes social contact and personal development (Popenici and Kerr 2017). Yet the brain-computer interface, through which information is neurally implanted, is a common theme in literary and media science fictions (Andrews 2015), wherein characters can be given encyclopaedic knowledge and “taught” high-level skills without the time and effort required in a traditional learning environment. The trope appears across a broad range of science fictions, including Brave New World (2020), Red Dwarf (1988-99; 2008- ), Dollhouse (2009-10), Chuck (2007-12) and, perhaps most famously, the Wachowski sisters’ genre-defining The Matrix (1999). The brain is treated as a computer that can be upgraded and reprogrammed with new skills: the learning process is removed; knowledge (how to defuse a bomb) and physical skills (martial arts) become programs that can be instantly uploaded. These education and learning discussions focus more often on adults than on children. Downloading information directly into the brain is “seductively easy” and perhaps efficient, but it is stripped of the creativity and collaboration that allow for knowledge advancement (Andrews 2015, p.346). Learning is a process, an action stemming from the desire to inquire and advance our current knowledge; by removing the act and performance of learning, do we risk promoting passivity?

The brain-computer interface is so common in science fiction media that it feeds the fears of many teachers. Teaching in further and higher education during the pandemic necessarily evolved towards a blended-learning pedagogic approach, where material is delivered through pre-recorded lectures/podcasts and face-to-face, albeit virtual, discussions. Teaching professionals have often expressed fears that these recordings could be retained and replayed in future semesters, removing the need for the “originals” (the teachers themselves). What these fears convey is a lack of trust in upper management’s recognition of the wide array of interactions and skills that comprise true teaching. Science fiction media grotesquely simplifies the act of teaching as an information dump, enabling teachers to be replaced by content delivery platforms that are “not self aware and not learning” (Kupferman 2020, p.7): replicators rather than innovators. True educators innovate. They recognise and respond to individual learning needs and circumstances; isolation and unmediated content delivery as an imagined education future is perhaps desirable for upper management (who would love to cut the meagre costs represented by actual people), but would be so clearly disastrous for actual pedagogy as to be outside the realm of realism.

Doctor Who (BBC, 1963 to present)

Historically, the AI interface has been gendered, with real-world virtual assistants and imagined future smart homes (and spaceships) given “female” voices (see SARAH the smart home in Eureka [2006-12]; Lucy the ship in Killjoys [2015-19]; and the personification of the TARDIS, Idris (Suranne Jones), in Doctor Who [2005- ]). These figures are what science, technology, and communications scholars Yolande Strengers and Jenny Kennedy (2021) term “smart wives”: assistants like Siri and Alexa who support domestic life in a way that aligns with the idealised 1950s housewife. These patriarchal gender dynamics are also mapped onto our imagined education futures in visual media, with teachers presented as carers, facilitators, and homework helpers, all without the need for emotional reciprocity or human respect for their work. The difficulty of teaching in the digital setting has left educators feeling isolated from an experience that is usually interactive and reciprocal. Those using online platforms like Zoom and Teams find that they perform to blank screens and impersonal icons as participants choose to turn off their mics and cameras; the virtual learning pit (void) is a lonely, one-sided experience exacerbated by visions of the profession as something that could “easily” be replaced with digital assistants and robot schoolmarms.

As with many discussions of gender, the focus is often on women. But the naive stereotypes and limited visions of the teaching profession affect men just the same. The feminisation of assistive technologies, with which future images of teachers align, also limits the role and range of men in Western society. Deep-seated misogyny surrounding “feminine” roles in society, defined by Kate Manne (2018) as any attitude or action that serves to uphold the dominant patriarchy, also translates, both statistically and in terms of perception and representation, into a toxic masculinity that says men cannot be teachers, cannot show emotions, and should not excel at pastoral care for colleagues and students. This is perfectly framed in The Umbrella Academy: Sir Reginald outsources to an AI android the pastoral care and everyday education that children need to be healthy, mentally balanced, and integrated into society. This significantly limits our conception of AI, particularly given that the AIs represented in media are almost entirely written and/or directed by men. That many AIs in film and TV are presented in caring and pedagogic roles (nursing, mothering, teaching) tells men that these crucial societal roles are so disrespected and burdensome that they should look to robots to relieve them of the onus, both for themselves and for others.

Though artificial intelligence has a long way to go to achieve the sentient levels explored in film and television, education is nonetheless one of the most promising areas for current and potential AI applications. Our cultural narratives show that we have a very narrow and problematic perception of the role human teachers play in our society; without awareness of these biases and a dedication to combating them, any new tools we create to supplement education will suffer from the same issues.


References

Andrews, G. “Gus”, 2015. To Boldly Go Where No Learner Has Gone before: Independent Inquiry, Educational Technology, and Society in Science Fiction. E-Learning and Digital Media, 12(3–4), pp.343–360.

Gibson, M., 2017. Childcare Educators ‘Glorified Babysitters’? Belonging: Early Years Journal, 6(1), pp.45–48.

Kupferman, D.W., 2020. I, Robot Teacher. Educational Philosophy and Theory, 0(0), pp.1–10.

Manne, K., 2018. Down girl: the logic of misogyny. New York, NY: Oxford University Press.

Novak, M., 2013. The Jetsons Get Schooled: Robot Teachers in the 21st Century Classroom. Smithsonian Magazine. [online] Available at: https://www.smithsonianmag.com/history/the-jetsons-get-schooled-robot-teachers-in-the-21st-century-classroom-11797516/ [Accessed 20 Mar 2021].

Popenici, S.A.D., and Kerr, S., 2017. Exploring the Impact of Artificial Intelligence on Teaching and Learning in Higher Education. Research and Practice in Technology Enhanced Learning, 12(1), p.22.

Strengers, Y., and Kennedy, J., 2021. Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot. Cambridge, MA: MIT Press.


Author Biographies

Dr Amy C. Chambers is a Senior Lecturer in Film and Media Studies at Manchester Metropolitan University. She is a science communication and screen studies scholar, and her research examines the intersections of science and entertainment media, women’s filmmaking, medical horror, and science fiction. Amy’s current project is on women-directed science fiction, in which she will watch and review every SF film directed by a woman; follow the project on her website (amycchambers.com #WomenMakeSF) and via the podcast she co-hosts with Dr Lyle Skains.

Dr Lyle Skains is a Senior Lecturer in Health and Science Communication at Bournemouth University, where she researches narratives for science communication, conducting practice-based research into writing, reading/playing, and publishing digital and transmedia narratives, and how these can be used for health and science communication. Her digital fiction can be found at lyleskains.com; her articles in Convergence, Digital Creativity, and Computers and Composition; and her books with Cambridge UP (Digital Authorship) and, forthcoming, with Emerald (interdisciplinary scicomm) and Bloomsbury (the convergent evolution of mainstream digital fiction).