An Introduction to Narrative Generators: How Computers Create Works of Fiction


An Introduction to Narrative Generators: How Computers Create Works of Fiction

Rafael Pérez y Pérez and Mike Sharples

Oxford University Press, 2023
ISBN: 9780198876601
https://global.oup.com/academic/product/an-introduction-to-narrative-generators-9780198876601

Reviewed: David Longman, 11 August 2023

This is a book about stories, how they are creatively developed from various textual elements and specifically how computers can help us understand how we do such a marvellous thing. It is a follow-on from the authors’ previous book ‘Story Machines’ which takes a broader and more historical approach to the topic of computational machines – digital or otherwise – which can generate coherent texts that human readers might recognise as stories. ‘Narrative Generators’ is more focused on detailed explanations of how computers can be made to produce stories from given elements such as characters, events, actions, plots and even themes in time-based sequences following a traditional format of beginning, middle and end.

However, this is not a book of literary theory in the usual sense and it does not approach its subject from the perspective of semantics, linguistics, poetics or aesthetics, although all these appear here in some form. Instead it aims to describe how stories are built up from simpler components of text or data using procedural rules to combine them. Such techniques are not restricted to fiction and have already been in use for some time, often invisibly, to automate aspects of journalism, e.g. the production of news, some of which is genuine but some of which is itself a form of fiction or ‘fake news’.

For these authors the rationale for building narrative generators (or story machines) rests on the idea that we “live through stories” or narratives about ourselves, our lives and the worlds we occupy. Stories suggest not merely the act of imaginative creation – though that is the focus of this book – but a fundamental way in which our minds work to make sense of our experience of the world, a fundamental feature of human thought and cognition through which we shape our lives intellectually, emotionally and socially.

Understanding this process is a key reason for the scientific and computational study of how stories are made. We still know too little about how ‘narrative generation’ is formulated in human thinking and here the authors remind us of the important role that the science of computation has to play in furthering our knowledge about the workings of the human mind:

“ … Computer science, and in particular artificial intelligence, is a powerful tool that can contribute to this endeavour. This book describes how computer programs can generate narratives and how studies of computational narrative can illuminate how humans tell stories.” (p2)

Here then is the scientific position for AI as a tool for understanding ideas about how we think, a stance that harks back to the very origins of Artificial Intelligence as an area of scientific study: building models to test hypotheses and theories of minds in order to understand them. This endeavour requires exhaustive detail in the construction of models, of course, but also, and importantly, models of story making that produce outputs that are coherent, aesthetically appropriate or intelligible to an audience. They must ‘work’ to some extent and be recognisable to the human reader as stories which, even to a small degree, reach back into human experience and traditions; they have to mean something.

In this regard the book’s subtitle, “How Computers Create Works of Fiction” is an important qualifier because, by exploring fiction specifically, it avoids the many problems of truth, bias and, to a large extent, the issue of informational garbage that such machines may produce. Fiction, after all, is essentially not bound by the logic of truth but ruled mostly by cultural expectations. A word salad produced in the name of fiction is just that and it can be ignored or admitted as the conventions of taste, aesthetics, and perhaps marketing, might allow. (Fiction generates its own ‘fake news’ of which The Protocols of the Elders of Zion is a famous, and profoundly appalling, example). None of this directly concerns the purpose of this book but the computational work described here can help to explain how such content is manufactured particularly in the present age of automated ‘intelligence’.

This point should be emphasised. The aims of the examples and techniques described here are not merely to devise cognitive machines that can imitate the creativity that lies behind story making and thereby add to the countless screeds of deathless prose that advance across our cultural landscapes (we have GPTs for that!).  This is a book that illustrates a search for an understanding, an explanation, of important aspects of human creativity through which we shape our lives intellectually, emotionally and socially. It is first and foremost a book of experimental science, a testing ground for understanding theories and hypotheses about how our creative minds work.

The book takes a specialised approach in two ways. First, it does not assume advanced knowledge of computer programming (although most of the examples under discussion have been created with sophisticated programming techniques) but second, it does presume that the reader is able to interpret dynamically what are essentially static accounts of how such computational machines can be constructed. For this a basic understanding of how computer programming works is helpful. An obvious analogy might be to suggest it is rather like trying to understand the experience of driving a car by reading a Haynes manual!

In this respect ‘Narrative Generators’ at times requires considerable acts of imagination and points up the challenges that this book can present to the unaccustomed reader, because computational models such as these can only be tested and their effects seen by running them programmatically. This is, if you like, the key component of how the ‘computational method’ does its work.

However, logical thinking, a constructionist turn of mind and an interest in the general aims of computational science are enough to grasp the modelling that is going on here. Note that there are no claims here about ‘machine intelligence’, and scepticism towards that particular trope is entirely consistent with the idea that computational machines can generate insights into the mental processes that underlie imaginative writing.

The digital transformation of social communication and information practices across the entire spectrum of everyday genres including financial, commercial, scientific, political and artistic has been gathering pace for some time, perhaps longer than many of us realise, and the kind of techniques discussed in this book have already effected significant changes in our use of email, spell checkers, search engines and word processors. It is no longer a pun to refer to some of these changes as transformational because that very term now appears in the descriptive name of one of the most prominent of the current crop of digital artefacts: Generative Pre-trained Transformers, or GPTs.

This raises an interesting and crucial point about the history of all these endeavours to create machines which, in various ways, represent or mimic human language use. As already noted, Narrative Generators stands at a juncture between what might now be termed ‘traditional’ AI, where the aim is to understand how human cognition works by building dynamic models of cognitive processes, and the newer neural approaches. As the authors note, the former might be thought of as the representational approach, building testable models of how minds work, albeit in restricted domains such as storytelling. To build such models relies on science where theory building and hypothesis testing are paramount. Chapters 1 to 8 in this book take us through that approach, and the various projects described here are based on postulations about the elemental components from which stories are made.

The book is well structured and progresses from describing relatively simple and accessible models using templates, much like the online forms that are such a familiar part of everyday online life. Phrasal substitutions are made from a simple database stored in a spreadsheet containing lists of character names, descriptions of actions and locations, and these can be developed to be surprisingly subtle even if, in their simplest form, they are limited by a lack of flexibility. Chapters 2-8 take us through increasingly sophisticated approaches that retain something of this template model but develop it to allow for greater variation and subtlety. Certain kinds of reasoning, computationally expressed, can be introduced into the manner in which characters, events, and outcomes are ‘substituted’, up to increasingly sophisticated levels such as authorial intentions and thematic frameworks.
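The template-with-substitution idea described above can be sketched in a few lines of Python. This is not code from the book; it is a minimal illustration, with invented character names, locations and actions standing in for the kind of lists a spreadsheet ‘database’ might hold.

```python
import random

# A tiny 'spreadsheet' of story elements: characters, locations, actions.
# All entries here are invented for illustration.
CHARACTERS = ["the knight", "the hermit", "the merchant"]
LOCATIONS = ["a ruined tower", "the market square", "a dark forest"]
ACTIONS = ["discovered a hidden letter", "lost a precious ring", "met an old rival"]

# A story template with named slots, much like an online form.
TEMPLATE = ("One morning, {hero} set out for {place}. "
            "There, {hero} {event}. "
            "By nightfall, {hero} understood that nothing would be the same.")

def generate_story(rng: random.Random) -> str:
    """Fill the template by substituting randomly chosen elements."""
    return TEMPLATE.format(
        hero=rng.choice(CHARACTERS),
        place=rng.choice(LOCATIONS),
        event=rng.choice(ACTIONS),
    )

rng = random.Random(42)  # fixed seed so repeated runs give the same story
print(generate_story(rng))
```

Even this toy version shows both the appeal and the limitation: every output is grammatical and story-shaped, yet every output is also a permutation of the same fixed frame, which is exactly the lack of flexibility the later chapters work to overcome.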

But today we have entered the age of what might be termed neural AI. The most prominent, and for many perhaps the most disturbing, example is the power of the ‘neural net’ built from artificial ‘neurons’. The difference is profound. Its most visible incarnation is in the form of GPTs. Whereas in Chapters 1-8 we can trace much of the intellectual reasoning and computational engineering behind the design and implementation of story making machines, the artificial neuron, outlined in Chapter 9 and followed in Chapters 10 and 11 by its role in the architecture of artefacts such as GPT-4, offers little by way of any structural analysis of how stories are generated. All we have is a digital analogue for what is presumed to be the biological basis of brains and hence of thought – the neuron. And here too is a disjunction with the attempts to model intellectual processes described in other sections of the book, for neural nets rely almost entirely on sophisticated statistical mathematics. Astonishing they may be, but we are left guessing, wondering yet somewhat baffled, as to how they achieve their linguistic capability. They are yet another example of Arthur C Clarke’s famous dictum that “any sufficiently advanced technology is indistinguishable from magic.”

There is history here of course and the idea of an artificial neuron, as the authors point out, reaches back into the origin stories of Artificial Intelligence, probably predating the cognitive theories of how minds work that made computational modelling more tractable. The only ‘model’ that neural nets provide is an electronic analogue for the brain, in so far as we agree (a) that the brain comprises neurons and (b) that these neurons are the building blocks of minds (Minsky’s famous meat machine); but we do not have a model of how a GPT can tell us a story, only that it can. In other words, while GPTs offer a performance of story making, as yet they do not provide a model of how stories are made. Perhaps, ironically, a new science is required if we are to understand this – an AI of AI!

This may cast us back to one of the early debates in the history of the cognitive sciences and emergent AI when, in 1959, Noam Chomsky critiqued the Skinnerian approach to understanding language generation. In simple terms, Skinner theorised that all learning, and language learning in particular, arises from the acquisition of examples that are ‘reinforced’, i.e. made ‘correct’, by various social means (parental engagement, pedagogical practice etc.). Chomsky on the other hand argued that such a mode of learning cannot explain why grammatically correct but nonsensical sentences are nevertheless possible, such as “colorless green ideas sleep furiously” (or, from an earlier age, Lewis Carroll’s Jabberwocky poem). For Chomsky there had to be an underlying, conceptual ‘grammar machine’ on which all reinforcement relies and by which it is calibrated, a ‘machine’ that could in principle be described and explained.

This book might not appeal to an audience for whom language and storytelling are regarded as fundamentally human activities and an expression of our non-mechanical spirit. However, it is timely and relevant in today’s cultural landscape where the machine generation of linguistic content has reached new heights of sophistication often indistinguishable from that produced by human writers. Restricting itself to the understanding of how fictional, imaginative texts are produced allows for a focus on the type of cultural object in which we are immersed on a daily basis.

There is, however, an audience ranging from computer scientists engaged in this type of investigative work to, importantly, teachers at all levels of school and university education, for whom the role of computation in understanding how machines can be made to do these things does not deny the ineffable qualities of human creativity. Indeed, a rather striking, if understated, observation throughout this book is how often practical human intervention is required to edit the output of the various systems described, including GPTs.

All too often, and all too obviously, the stories that they produce cry out for an editorial hand, for a human to come along and improve them. This weakness is a strength because the limitations of our computationally generated narrative models are difficult to hide. In contrast, tools such as GPT that rely on mathematically driven models of neural architecture may present a greater risk to human creativity, because the illusory finesse with which they produce refined grammatical texts may all too easily obscure underlying inaccuracies of fact or distorted moral and cultural values.

Teaching Machines: The history of personalised learning


Teaching Machines: The History of Personalized Learning
Author: Audrey Watters
Publisher: MIT Press
Published: August 2021

Review date: 10th October 2021
Posted: 26/12/21

David Longman, TPEA

An important theme to emerge from reading ‘Teaching Machines’ by Audrey Watters (MIT Press, 2021) is that the ‘industrial age’ of mechanised educational technology has not come to an end, as some might believe. Instead, it is thriving.

In her introduction Watters summarises, among others, the argument of Sal Khan (the creator of Khan Academy) that for the first time [my italics] online learning enables a truly personalised approach to learning in or out of school. At last, online learning enables us to break free from the stifling effects of an outmoded education based on regimented, bureaucratic organizations that fail to enable effective learning. It is, however, an all too familiar critique of education, one that has served its time as a rationale for disruptive innovation and is often taken up by those proponents of learning technology who argue that schooling is somehow broken.

As Watters argues and aims to demonstrate in this book, the ‘end of history’ story on which Khan bases his claims is wrong. She goes further. Not only is he wrong about the unchanging face of schooling but he denies history. It reflects a general ‘Silicon Valley’ inspired ideology that the past is irrelevant and that only the future matters. In this way, old ideas can be recast as unprecedented, innovative and disruptive to a moribund educational system:

“What today’s technology-oriented education reformers claim is a new idea – ‘personalised learning’ – that was unattainable if not unimaginable until recent advances in computing and data analysis has actually been the goal of technology-oriented education reformers for almost a century.” (p9)

Teaching Machines is a well-researched and largely well written illustration of this history, providing critical commentary on the notion that machines can be so designed as to afford effective learning with minimal intervention from teachers. It also illustrates that implementing machine-based learning at scale has long been a challenging ambition (again an issue not always acknowledged by contemporary disruptors). Although her focus is American education the story is relevant to the design and implementation of technology in similar school systems.

The scope of the book is confined to key figures and projects in the quest to develop mechanical tools for school learning and teaching covering the years from about 1920 to 1970. The technology under discussion is literally boxes containing gears, rolls of paper or similar media, occasionally electric lights (and in an early version even a dispenser offering chocolate bars), all controlled by simple levers and keys for user input. Although now superseded, many of the main features of these pre-digital devices remain familiar today in much educational software:

    • to present a ‘unit’ of content, usually offering a question or task;
    • to provide a means for a learner to respond;
    • to provide feedback on the response (ideally immediate);
    • to move to a new ‘unit’, or repeat the current one, based on that feedback.
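The four features listed above amount to a simple control loop, and can be sketched in Python. This is an illustrative reconstruction, not code from the book; the frames and the learner's answers are invented.

```python
# Hypothetical 'units' of content: each frame pairs a prompt with its accepted answer.
FRAMES = [
    ("2 + 2 = ?", "4"),
    ("3 x 3 = ?", "9"),
    ("10 - 7 = ?", "3"),
]

def run_session(answers):
    """Step through the frames, repeating a frame until the response is correct.

    `answers` is an iterator standing in for the learner's key presses.
    Returns a transcript of (prompt, response, feedback) events.
    """
    transcript = []
    index = 0
    while index < len(FRAMES):
        prompt, correct = FRAMES[index]   # present a unit of content
        response = next(answers)          # the learner responds
        if response == correct:           # immediate feedback...
            transcript.append((prompt, response, "correct"))
            index += 1                    # ...advance to the next unit
        else:
            transcript.append((prompt, response, "try again"))
                                          # ...or repeat the current unit
    return transcript

# A learner who slips on the second frame, then recovers.
for prompt, response, feedback in run_session(iter(["4", "8", "9", "3"])):
    print(prompt, response, feedback)
```

The gears, paper rolls and levers of the machines Watters describes implemented essentially this loop in hardware; the `while` loop and the repeat-until-correct branch are the mechanical linkages translated into code.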

A central figure running through this story is B.F. Skinner, not so much the originator of Behaviourist psychology (its origins trace back to the early 1900s), as its post-war bête-noire doggedly promoting his mechanical teaching machine as a device to change the manner and quality of school-based learning. The story of Skinner’s machines covers a surprisingly short period from about 1949 to about 1969, with antecedents in the 1930s. However, by the end of this period there was limited tangible effect on education beyond a few successful if not entirely rigorous trials.

Although today mechanical teaching machines tend to be known as ‘Skinner machines’, he was not the inventor of the concept. First World War recruitment revealed low levels of health and education in the population. Then, as now, education was perceived to be poorly managed with overworked and ineffective teachers. At war’s end a new emphasis was placed on testing for attributes such as intelligence or retention of learning. A key figure here was Sidney Pressey and his ‘automatic teaching machine’ that first appeared in 1923. Pressey foresaw an “industrial revolution” in an education system he regarded as stuck at the “crude handicraft stage”. However, like Skinner after the Second World War, he experienced frustration with efforts to develop his device as a commercial enterprise (though he was unfortunate to be doing this in the midst of the Great Depression).

Skinner’s own early work at Harvard in the 1930s focused on developing the idea of behavioural conditioning using devices he made known famously as “Skinner boxes”. With these he trained pigeons to get food when the correct lever was pressed. His insight that these techniques might be applied to school learning came in 1953 when he visited his daughter’s fourth grade class (roughly upper Key Stage 2 in UK terms). As Skinner told the story he was shocked to see that the classroom teaching of basic arithmetic failed to meet what he regarded as the minimum conditions for learning, namely progression matched to ability (i.e. ‘stimuli’ in behaviourist terms) and timely feedback (for the pigeons an edible reward). Good students were held back, he observed, while those who needed help could not keep up. “The teacher”, he declared, “is out of date” and cannot provide adequate and timely feedback (i.e. reinforcement) to many children at once. Echoing Pressey (whom he met and corresponded with during these post-war years), Skinner also declared that an “industrial revolution in education” is needed.

Interestingly, even as Skinner promoted his approach to Behaviourism as the true science of learning, a paradigm shift was already beginning to take place that would challenge this viewpoint. Cognitive Science began its emergence as the new science of psychology and human learning (an early conference was organised by Jerome Bruner in 1959) and the term ‘Artificial Intelligence’ as a framework for understanding human thinking was adopted by a community of scientists and engineers at the now famous Dartmouth College conference in 1956.

There is much in this book to interest students of the history and origins of contemporary education technology, as well as commentators on the current scene. In 1957 the launch of Sputnik stunned American national pride in its cultural and scientific prowess, leading to many reforms. In education a new focus was brought to the teaching and learning of critical subjects such as mathematics and science. While this gave a boost to the idea of teaching machines, many problems about their design and usefulness remained to be solved. Content was a major issue, and the field of Programmed Instruction (PI) grew in importance as more work was done to enhance the quality of both what was ‘taught’ by machine and how it was taught through sequence and structure.

It is a familiar story to contemporary readers that efforts to verify the value of this new technology through large-scale school-based investigations were largely ineffective or that by contrast projects by publishers to produce cheaper textbooks designed on PI principles (what today we might call branching texts) were relatively successful. Like Pressey before him Skinner had chronic difficulties in persuading his manufacturing partners to produce machines to a quality benchmark that satisfied him. Even so, it was hard to compete because his machines still lacked content. He later entered into partnerships with encyclopedia publishers because their established door-to-door sales model helped to forge a strong link with the idea of home-based learning and ‘free’ teaching machines were offered with every encyclopedia sale.

However, these efforts did little to enlarge the market or make teaching machines more acceptable to educators. In schools, the familiar issues of staff training, machine reliability and the scarcity of curriculum content dogged the enterprise. Moreover, by the turn of the decade, around 1970, new ideas about pedagogy were emerging. This included a backlash against Behaviourism and increased dissatisfaction with post-war efforts to reform the teaching of mathematics and science. But Watters argues that teaching machines did not simply die out; they were absorbed:

“… many of the key figures in the teaching machine movement did not suddenly stop working in teaching or training when the focus turned to computer-based education. Many of the ideas that propelled programmed instruction persisted and spread into new practices and new technologies.” (p249)

Here she makes a case for the idea that behaviourism and its core concept of conditioning did not disappear from the mainstream of educational technology, as its most articulate critics might argue, but continues to inform the design aims of present-day educational technology. Indeed, it is fundamental to the massively successful “industrial” character of our new digital technology culture for it too relies on sophisticated techniques of “behavioural engineering” to actively nudge (or push) our preferences, desires, ideas and opinions towards ends that we often serve unwittingly. This is an important point of view and one that needs careful consideration.

It’s a case of: “Behaviourism is dead! Long Live Behaviourism!”

If there are limitations to this valuable critique of teaching machines it is, perhaps, that the idea of personalised learning, the subtitle of this book, does not get quite the detailed critical attention that the reader might expect. We are also left unsure about the details of programmed instruction methods using the ‘old’ media of paper rolls, punched card discs, projected slides and so forth, so that we might ponder how much of that early work has been carried forward. There is also a lack of illustrations, leaving the reader to somehow imagine the various machines described in the text but which are nowadays quite unfamiliar. Perhaps MIT Press could have included more. Audrey Watters has curated a good collection of images and technical descriptions on her blog (for example see this page) and more of that material could have been included to enhance this noteworthy book.

Review: Raspberry Pi400

Reference link to product: https://www.raspberrypi.org/products/raspberry-pi-400/?resellerType=home


David Longman, 27/11/20

At less than £100 the Raspberry Pi400 provides a meaningful solution for the notorious ‘three body problem’ that confronts computer buyers – to balance the conflicting requirements of  functionality, performance and price. The Pi400 is quite astonishing and it is fantastic value.

The basic keyboard unit at £67 has no mouse, SD card or micro-HDMI to HDMI cable for a monitor, but for just a bit extra at £93 you can get all that in one bundle plus a handy user guide (with the Raspberry Pi OS pre-installed to the SD card). On reflection that might have been the better deal because I had no spare SD card or HDMI lead, but these are cheap enough to buy separately. The pre-installed software included with the Raspberry Pi OS is more than enough to provide anyone with an accessible computer environment and, because it is free and/or open source, it is readily maintained.

The Pi400 surely hits the mark for the everyday user. It is a compact PC and with the included software it provides more than sufficient performance for writing, emailing, web browsing or common data management tasks. This small but punchy device will meet all your needs. But also for me, a lifetime Windows user who is always interested to explore further ‘under the hood’, this is a great way to dive into the Linux OS with minimal hassle. Novice or advanced programmers too will be satisfied by the inclusion of Python and Scratch while tinkerers will enjoy the GPIO interface allowing the Pi400 to interact with the physical world using many widely available, and cheap, connectables.

Thus, for a relatively small outlay you can have a high end PC sitting modestly on your desk and, if you don’t already have one, a reasonable monitor can be bought for as little as £80 (or, if your TV has an HDMI port, use that instead unless you have to argue with your housemates for TV time!). The Pi400 could readily support most home working needs, or it could be a tool of choice for students following a computer science curriculum, and a small start-up business might comfortably fulfil its basic IT requirements (along with a reputable cloud storage provider) without consuming significant capital resources.

What’s not to like?

 

The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child

Author: Morgan G. Ames

Publisher: MIT Press

Published: October 2019

Review date: 4th April 2020

David Longman, MirandaNet

Teachers who work with and think about the role of computing in education either as a tool or as an object worthy of study will find this is an important and interesting book. For some (including the author of this review) the idea that programming a computer could be a novel way to learn about important ideas in mathematics, language, physics or, a little recursively, computing has been a captivating way to think about the educational value of computers. This was particularly so when personal computers were becoming affordable for individuals and, with funding, for schools.

Such was the influence of Logo and the enormously creative work of Seymour Papert, Cynthia Solomon and many others at MIT during the 1960s and 1970s. Embedded in the artificial intelligence culture of the day, programming was viewed as a representational tool through which the mysteries of human thought could be unravelled. Hence Papert’s slogan, one of many, that learning through programming is a process of ‘thinking about thinking’.

We live in a different world today, but at that time when the power of computing was moving beyond its origins in electronics, mathematics and engineering into wider cultural arenas, there seemed no limit to the possibilities opened up by computation. In education, Seymour Papert became something of a prophet for this new tool that could bring a new perspective to ongoing philosophical and political debates about the purpose and control of education, especially at the school level.

The computer, it was said, could enable children to be free of a deterministic style of schooling that reduced learning to a prescribed sequence leading to predetermined outcomes regardless of individual potential or preference. School, in other words, was seen by many as a factory system steering children towards things that others had decided they should know and understand. Computers and programming, on the other hand, offered the potential for self-fulfilment through expressive thought experiments which reduced, if not eliminated, the need for instructional teaching. Papert and his milieu painted the computer as a tool of liberation and intellectual freedom through a philosophy of learning he termed ‘constructionism’.

Needless to say, it is the seductive ideas of liberation and freedom that is a key source of the charismatic power of constructionism. In The Charisma Machine, Morgan G. Ames presents a fascinating case study of a major effort to implement constructionist learning with computers at scale. Starting in 2007 she researched the project, including conducting ethnographic field work in Paraguay, following the experiences of participants and planners who were included in a pioneering implementation of the One Laptop Per Child (OLPC) project. The brain-child of Nicholas Negroponte, a like-minded enthusiast and colleague of Papert and a co-founder of the MIT Media Lab in 1985, the OLPC project was a significant experimental project for constructionist learning. The ‘charisma machine’ of the book’s title is the XO laptop which was created specifically for the OLPC project and drew on the intellectual and cultural background at MIT which informed its seductive rhetoric and ambitious educational goals.

The XO was intended to be a personal machine owned by each participating student in the project, for their use. Each machine included pre-loaded open-source software designed on constructionist principles to exemplify the core philosophy behind OLPC. Thus, it included an early version of Scratch, Turtle Art, and some other useful tools. The XO was also WiFi enabled and included a browser. Alongside the fieldwork looking at the use of the XO in school settings, Ames also explores a ‘historical anthropology’ of the intellectual culture at MIT. While the fieldwork illustrates a significant mismatch between the intentions of OLPC and the everyday uses to which the XO was put, her exploration of the cultural assumptions of the project is equally revealing and helps to explain many of the project’s shortcomings. 

Some of the weaknesses of OLPC in Paraguay are by now familiar because they apply to almost any school trying to integrate laptop use into students’ lives. A lack of supporting infrastructure, particularly in rural areas, for battery charging and for WiFi, combined with the overstated robustness of the XO, undermined the one-to-one use of the machines. Thus in classroom settings the XO often had to be shared, sometimes among three or more students, limiting the intended usefulness of the device as a personal machine, or teachers would have to design a curriculum that students could follow either on paper or with a laptop. This, combined with a lack of professional development and training, left many teachers poorly prepared to work with a constructionist style of learning that ran counter to the more structured expectations of the school curriculum.

Urban schools were better placed but even here the expectation of a constructionist model of learning where kids would learn about computing through their own tinkering with software was thwarted by the overwhelming preference of the students for downloading music and videos from the internet. The centrality of the more entertainment-focused aspects of the world wide web as a source of cultural interest seems to have been thoroughly, perhaps mistakenly, underestimated by the OLPC project designers. This was almost certainly related to the strength of the constructionist convictions on the part of the project designers that children would naturally indulge their curiosity and explore ways to make the computer do interesting things.

The exploration of the assumptions of the project designers forms a valuable and fascinating aspect of this book and Ames shows convincingly that an implicit cultural hubris had a deep effect on the OLPC’s design. The notion of constructionism as conceived by the MIT community rested, Ames argues, on a gendered view of the idealised mind-set of the constructionist learner, namely the “technically precocious boy”. This image of the socially isolated boy, tinkering away in his bedroom on fascinating projects independently of the formalised schooling that he is obliged to endure, undoubtedly pervades the cultural history of Silicon Valley, where the household garage or bedroom den was a frequent site of technological invention and insight. However, the two main facets of this story – the socially isolated boy and the curiosity-crushing nature of the school curriculum – turn out to be almost entirely untrue. Almost without exception, the great minds inhabiting MIT (mostly men) experienced expensive and lengthy formal education. While many also tinkered, none were lone geniuses.

This sexist myth clearly influenced OLPC projects on the ground. In one telling example Ames describes how, as the project developed, the project leaders searched for examples of constructionist learners who were using their XO to unpick the power of computation for learning. Few were found but one boy and one girl, in particular, were identified as concrete examples of the reality of constructionism at work. Yet when it was time to promote this positive effect it was the boy who was selected to travel to MIT and participate in an OLPC conference where he was, perhaps, regarded as a specimen of constructionism’s success.

This timely, well-written book will be of interest to anyone working in the field of educational technology. Even for readers who have never fully subscribed to constructionist learning, there is much here to ponder about how various myths and assumptions can influence our actions in the field of educational technology. The values that shaped the intellectual and social culture of the OLPC project remain influential to this day. On the positive side, constructionism as a model of teaching and learning remains active. However, sexist prejudice continues to be rife throughout the technology industry, and cultural imperialism coupled with billionaire libertarianism continues to taint the force for good that so many global platforms, including OLPC at its inception, see themselves as representing.

Ames’ analysis does not provide a predictive tool. While she shows that the same charismatic stories do tend to recur, we cannot say in advance how charisma will appear and then seduce us nor the particular forms that persuasive rhetoric may take. The importance of this book is to alert us to how ideas about technology are situated in a context of assumptions and prejudices. It is up to us to be more confident about putting forward a critique that helps to balance overstated aspirations if and when they appear. This may be especially relevant in today’s educational landscape where the new curriculum emphasis on acquiring knowledge about computer science fuels expectations as strong as those promoted by Papert and Negroponte.

Should Robots Replace Teachers? (v2)

Should Robots Replace Teachers?

Neil Selwyn

Publisher: Polity

ISBN: 978-1509528967

Published: Sep 2019

Review date: 20th October 2019

David Longman, TPEA

The automation of professional expertise is at a tipping point and education may be the next to succumb. Machine learning (often loosely referred to as AI) lies at the heart of this transformation. It makes possible the automation of judgements that usually rely on teachers’ accumulated wisdom in both knowledge and relationships. Through teachers, students acquire an understanding of subjects and disciplines and they do so with the essential social and personal support that teachers can provide.

Neil Selwyn’s new book is a measured and accessible discussion about how new computational tools might change or diminish a teacher’s professional expertise in both knowledge and relationships. He argues that a wider critical debate about the impact of automation in crucial areas such as assessment, pastoral support, and content teaching is overdue, for “it is worrying that … [it] is not already provoking great consternation and debate throughout education”.

While the traditional boundaries of the management and organisation of higher education have been opened to the influence and investment of many external agencies and entrepreneurs, the pace of change may still be regarded by many as too slow. Change is needed but it can leave the education profession vulnerable to bad ideas as well as good. Unless we pay attention, the automation of teaching could lead to the diminishment of teachers and student learning everywhere.

For example, in higher education ‘intelligent agents’ (aka chatbots) are already making their mark in responding conversationally to student enquiries. Purportedly, chatbots improve student engagement and motivation (including offers of counselling support) while freeing tutors or administrators from supposedly burdensome FAQs about course content and related issues. Time saving is often a key marketing pitch but the effectiveness of these software machines is less well understood and time-saving may be illusory.

The use of Turnitin, a widely used system for detecting plagiarism in student assignments, may seem on the face of it a boon to assessment quality. However, Turnitin also illustrates the potential risks associated with such automation. First, all student writing is treated as a potential fraud and this undermines the crucial bond of trust between teacher and student. Second, Turnitin’s ongoing development is bringing us to a tipping point where, by applying machine learning to the recognition of a student’s writing style, the value of an educator’s expertise in evaluating this critical aspect of learning is diminished. Such a tool may (or may not) lead to improvements in plagiarism detection but it also represents a first step in the automation of academic judgement that can subvert a key element of academic expertise.

Selwyn has little to say about how these technologies find their way into our classrooms, workshops and lecture halls. What kind of policy-making processes drive their implementation? The usual forms seem to have given way to influential organisational and corporate networks such as Apple, Microsoft or Google, all capable of working at scale. These ‘fast policy’ networks are able to influence practice directly with tempting technologies and considerable investment.

Selwyn’s book is timely. The extraordinarily rapid emergence of influential AI-based technologies in higher education should generate significant debate and help us to keep ahead of the machines:

“… debates about AI and education need to move on from concerns over getting AI to work like a human teacher. The question, instead, should be about distinctly non-human forms of AI-driven technologies that could be imagined, planned and created for educational purposes.” (127-8)

While it is not clear what such “distinctly non-human forms” might look like or be capable of it is an important idea. Teachers need to work together with machines “…on their own terms…” to improve the quality of education. This is not a replacement but a partnership that preserves and amplifies the important qualities of human teachers. Above all, for this partnership to work, educators must ensure that they have a clear and articulate voice that guides the changing landscape of professional practice with technology.

Should Robots Replace Teachers? (v1)

Should Robots Replace Teachers?

Neil Selwyn

Publisher: Polity

ISBN: 978-1509528967

Published: Sep 2019

Reviewed: 10 Oct 2019

David Longman, TPEA

Neil Selwyn is a well-known academic who has written several careful books on education technology. Once again, with “Should Robots Replace Teachers?” he provides a clearly written, accessible book that can lead the time-limited educator to consider the key issues and questions raised by a new generation of education technologies built around techniques of machine learning and artificial intelligence.

It is a lively time to be involved in teaching and learning. What are effective ways to teach and for students to learn in a world that, by all accounts, is changing dramatically and, for some observers, an education system that is not well adapted to the times? Intense debate on these broad issues often prevails, as it should, and Selwyn’s book ought to encourage well-informed contributions about many of the significant features of the rapidly evolving education technology environment.

Over the last decade in the UK there has been a noticeable shift away from conventional information technology in education. In the school curriculum there is now a stronger and more central role for computer science as a taught subject (though this development has its critics), and across all phases of education new kinds of computational tools are finding their way into classrooms and lecture halls. This new generation of tools aspires to provide active and rich cognitive support for student learning and aims to automate at least some aspects of a teacher’s job (usually in the name of easing workload). Education technology is becoming less passive than it used to be, when it did nothing until someone used it for some purpose. These new forms of education technology have the potential to become active agents in the relationship between teaching and learning, guiding, diagnosing and providing feedback on progress to teachers and learners.

And there lies the challenge! This book is for teachers and educators who want to be equipped with a critical understanding of the looming transformation of education that new AI-based digital technologies intend to bring to classrooms, workshops and lecture halls. Selwyn, ever constructive but critical, notes rightly, that:

“… it is worrying that the growing presence of AI in classrooms is not already provoking great consternation and debate throughout education.” (25)

 “Despite the concerns raised in this book, AI in education is still not seen as a particularly contentious issue amidst broader debates around education.” (119)

There are many ways to explain this lack of concern: topic fatigue (we’re all a bit tired of hearing about education technology); information overload (there’s just too much information to absorb or understand); private sector promotion (maybe it’s all just overblown marketing hype); or expert endorsement (academics and technologists in the know say it’s great so why argue). Perhaps, too, we simply remain wedded to a belief in the essentially human nature of teaching so that, in our minds, teaching lies beyond significant automation and therefore there is minimal risk.

But these are precisely the reasons why educators should be critical and this book will provide a strong foundation for considering the implications of this new generation of education technologies but without submerging the reader in technical detail. 

Responses to the questions raised by Selwyn are urgently required. The social and political context of education is both dynamic and, in the UK at least, more contested perhaps than it has been for at least three decades. The traditional boundaries of management, ownership and organisation of education have been opened up to the influence and investment of external agencies and entrepreneurs of all kinds. The free-for-all of digital capitalism is seeking new horizons to exploit while the changing status of teachers has left the profession vulnerable to the influences of politically and commercially motivated marketing.

Selwyn can only point us in the right direction, to equip us to ask good questions for which there may be no ‘right’ answers. The important goal is to provoke debate: “… there is plenty of reason to expect the increased AI-driven automation of teaching to lead to the diminishment of teachers, teaching and education.” (121). This book, says Selwyn, “…is best seen as a provocation … “(131) and it takes us up to the important stage of helping to make clear what the issues are. It raises “…a host of informed and pointed questions…” and is “…an important first step in achieving meaningful and sustainable change.” (132). These claims for the book’s purpose are justified and they are exactly what the book achieves.

Education technology is fast becoming a more active agent in the relationship between teachers and learners. It is no longer a matter of knowing how to curate or limit one’s ‘digital footprint’ (vital though this is) because data extraction and automated interpretation have become so powerful. Software machines now engage in various forms of predictive modelling, extrapolating from what is already known about teachers or students to recommendations and judgements about where they should be in the future. Of course, this is what teachers do on a daily basis, but that machines might do this automatically should concern every educator: the primary teacher enthusiastically using apps like ClassDojo, a secondary school teacher seeking more effective oversight of pupil behaviour with AS Tracking, a college lecturer aiming to personalise learning with tools like Knewton, or a university lecturer concerned about student plagiarism.

Overall, the book brings order and clarity to the substantial arguments and questions about the purpose, value and efficacy of AI-based education technology. Chapter 2 is the only one that examines actual robots, i.e. devices that in whole or in part resemble a human form and behaviour. These are either programmable (their value for curriculum learning lies in the process of creating behaviours through coding), or they arrive ‘out of the box’ pre-programmed to respond interactively with children and students. While such devices are perhaps not so common in classrooms, the chapter raises profound issues about the role of robot ‘companions’, particularly when these devices purport to offer emotional support to needy individuals and/or guide a teacher’s attention towards those individuals they have identified.

These themes of bot-like behaviour pervade the entire book. Chapter 4 discusses intelligent tutoring systems and personal assistants, which are simply more abstract, less obviously humanoid software systems that nevertheless engage with children and students in various forms of emulated dialogue related to a curriculum. Such tools are by no means new and prototypes began to appear in the early 1960s. Today they have become both more powerful in terms of software and much cheaper to implement using everyday kit such as PCs, tablets and smartphones, all connected to ubiquitous cloud infrastructures provided by the commercial vendors of such systems.

In spite of the lofty claims that are sometimes made for these tools, they are not much more than what was once known as computer-assisted instruction or programmed learning, being built typically on a coached instruction model of one-to-one teaching. Where they differ is in the use they make of the data they absorb from interactions with students based on myriad data signals. These data, used to guide and structure the learning pathways a learner might follow, can include many types of biometric data to infer a range of more subtle personality features such as motivation, attitude or emotional states.

This is entering new territory where algorithms attempt to characterise the state of mind of learners beyond their local position in a sequence of curriculum content. In turn, this can lead to decisions about the attainment, capability or disposition of the student and may invoke actions such as repetition of material, testing and assessment, promotion to the next stage of curriculum content or, in the name of diagnostic assessment, flagging problematic issues that may require alternative interventions.

However, and thankfully, as Selwyn correctly notes, teaching is not simply a matter of directing learners what to do next. It also involves explanation and reasoning, which these systems, at their present state of development, are probably incapable of providing, if indeed they will ever be capable of doing so. For as Selwyn also reminds us, the role of the teacher’s personality and the performative, body-centred character of much good teaching is presently well beyond the capabilities of such technologies.

Moreover, and importantly for those concerned about ethics and privacy, these data are inevitably captured and integrated into cloud-based collective representations of learner behaviour. As will be well-known to readers of this review, the social and political issues surrounding the gathering of personal data into private ownership in pursuit of economic gain are some of the most toxic in public discourse today. Selwyn acknowledges these issues indirectly and they cannot be ignored:

“The suggestion of intelligent tutoring being rolled out across education systems needs to be taken seriously. If we are going to allow ourselves to have a learning companion for life then we need to think carefully about what we are letting ourselves in for.” (75)

The book ends with many questions but no straightforward answers; just plenty to think about. It provides an agenda that should frame any discussions about how computational technologies might be deployed for teaching and learning. Is the continuous monitoring and ‘nudging’ of learner behaviour the right way to go? How does this change our definitions of teaching? If AI is pragmatically ‘better’ at certain types of activity are we confident that the data it uses to generate its outputs are useful and accurate?

Above all, are we exchanging the technically smart for the socially stupid? Today, these new technologies are not usually transparent in the sense that the basis for judgements about attainment or capability can be explained clearly. Teachers are people who have learned what they know, so they also know something about how to learn it and have empathy with others doing the same. Teachers are, by definition, social beings, and they can use the whole range of social activity from thinking aloud to bodily performance to enable and encourage effective learning. They can compromise, negotiate, be spontaneous, or deviate when necessary. None of these important qualities are yet possible for advanced computational tools but their absence must qualify discussions about when and how to use them.

If there is an omission in Selwyn’s book, it is that he has nothing to say about how these technologies find their way into classrooms, workshops, studios or lecture halls. What kind of policy-making processes drive their implementation? According to more recent work on this topic (e.g. by Ben Williamson or Stephen Ball and his colleagues), we are no longer steered by traditional forms of policy making at governmental level and instead have entered an era of policy-making-on-the-go driven by non-governmental policy networks comprising organisations with strong vested interests. We have simply to observe the prominent role of influential corporations such as Apple, Microsoft or Google in fostering the use of powerful cloud-supported devices in schools.

Moreover, these actors are not only extraordinary developers and manufacturers of computational technologies but they are also extraordinarily powerful lobbyists on behalf of the idea of the ‘robotisation’ of education. It ought to be clear to the concerned professional that today the real players in framing, selling and implementing the education technology agenda are those same ‘surveillance capitalists’ that Shoshana Zuboff has discussed at length.

However, Selwyn suggests there could be a way forward in the professional race to keep ahead of the machines, and this is his final provocative thought but one that should keep us busy:

“Public policy and professional debates about AI and education need to move on from concerns over getting AI to work like a human teacher. The question, instead, should be about distinctly non-human forms of AI-driven technologies that could be imagined, planned and created for educational purposes.” (127-8)

It is not clear what such “distinctly non-human forms” might look like or be capable of, but it is an important idea. For, as he also writes, “… it is crucial that teachers work together with machines on their own terms … in ways that …improve the quality of and the nature of the education that results.”(126)

These two provocations work together. They speak of partnership rather than replacement, a partnership that both preserves and amplifies the important qualities of the human being without attempting to replace or automate them. Above all, for this partnership to work, educational professionals must strive to ensure that they have a clear and loud voice in the changed landscape of policy making.

The Wall

The Wall

The Wall by John Lanchester

(Longlisted Booker 2019)

This synopsis contains minor spoilers.

One long metaphor, a dystopia for our times with no get-out clause. While the main characters find safety at the end it is temporary, a momentary pause in a continuing struggle. This book won’t cheer you up! A well-written story and well put together, although I am slightly surprised it has been longlisted for the 2019 Booker. Do not look too deep. The metaphor does not bear much analysis, for it works on the level it presents: an oblique, symbolic look at our contemporary world.

Perhaps not so much a metaphor as a pastiche of that residual cultural memory of war-time Britain, so beloved of our British establishment yet a time experienced by relatively few alive today. It is a psychic memory that perversely fuels the mad fervency of our present jingoistic perception of Britain as a moated nation under siege. A recognisable Britain but one that smells of tea, trains and rations.

The story opens on the Ilfracombe stretch of the Wall at some indeterminate time in the near future, when catastrophic sea level rise has flooded low-lying and coastal areas of the world. The Wall is a vast infrastructure of concrete and bulk, of loneliness and boredom, dwarfing anything yet built, surrounding Britain. It is guarded by Defenders against the Others who, fleeing their own unviable lands, come from the sea to climb the Wall into the security of what remains of a surviving society.

There is no longer a coastline as we know it – no beaches, no sunny coves, only the great cliff formed by the Wall that, as thoroughly as it can be, is built up to the sea’s edge. Coastal villages and towns, once next to the beach, now reside in its shadow. The Wall has replaced the tidal edge, a bulwark against the rising sea and the threat from Others. The sea can only be seen from inland, from higher ground. It is a drowned world.

Britain is a nation under siege from desperate others coming from places we know little about but which cannot be endured. No-one really wants to know or cares who they are. They are simply Others and must be repelled. In their turn, they are ruthless. The Defenders are a conscripted, armed force, compelled to monotonous, arduous duty guarding the Wall on its ramparts. They live a hard life in spare comfort with few diversions, faced with an unforgiving duty cycle. The technology is not sophisticated – there are no lasers here, just bullets, grenades and bayonets.

Discipline is severe and the consequences of failure grim. If Others succeed in getting over the Wall, breaching the Defenders and entering Britain, then the same number of Defenders, those deemed responsible for the defensive failure, are themselves cast out to sea in an open boat with minimal provisions. Their punishment is to become, in their turn, Others. No justice, no mitigation, just a simple tit-for-tat. If twenty Others make it then twenty Defenders must be banished, and if there aren’t twenty Defenders left to be banished (battles can be severe and mortal) then the numbers are made up from next-in-line officers and officials.

Banishment entails the painful removal of the embedded identity chips that every citizen carries inside them. Without such a chip, a person becomes an Other, a nothing. It is what illicit Others aim for if they get over the Wall: to acquire a forged chip to enable a new life inside the Wall. British life is bleak and grey, resembling perhaps old movies of war-time life – crowded trains, wary civilians, unhappy homes. The scenario seems clear. This is Britain today (2019) – under siege, threatened by insidious invasion and paranoid about internal conspiracies to aid the Others. It is bleak. The only options seem to be to somehow join the ‘elite’ (they fly high in the sky in planes back and forth between who knows where), to become a Breeder to make more children to join the ranks of the Defenders, or to make a living as a member of the Help class, a form of indentured labour only one step from slavery.

Spoiler alert: The protagonist and narrator of the story, Kavanagh, suffers the fate of banishment, having begun the journey towards Breeder status in his bleak ambition to improve his standing. His banishment is the result of a real conspiracy-from-within led by a respected officer, himself once an Other, which enables more Others to climb over the Wall and enter Britain. Banished to the open sea with a group of other exiles, he faces great dangers – pirates, and a lack of navigation – and forlorn hope: the raft colony they come upon (reminiscent of Kevin Costner’s movie Waterworld) is soon destroyed. Eventually Kavanagh and his Breeder partner find some safety but it is inconclusive, a respite, temporary at best. They are doomed to continued exile.

Note: Walls are a fairly common literary image. Some examples are here: A Brief Review of Walls in Literature by Tom Mitchell – “Fear of invasion … is a more powerful form of control than bricks and mortar.” and more recently, Game of Thrones features a defensive Wall of ice.

Note: 12 May 2020

Well, well! This article asks: “As sea levels rise, are we ready to live behind giant walls?”