How do we make sure things in FE don't just "get worse"?

19 March 2026

By Jamie Pearson, Senior Lecturer in Education at Darlington College and Research Further Scholar

As with many of you, I spent much of Christmas Day lounging on the sofa, reading my new book, while the children played with their new toys and the BBC broadcast another Julia Donaldson adaptation in the background. My book, a Christmas gift, was Enshittification: why everything suddenly got worse and what we can do about it. In it, internet activist and journalist Cory Doctorow explains how and why the much-loved internet platforms we rely on are getting worse, and how he coined the term ‘enshittification’ to describe the decline.

That term isn't simply a description; it's an explanation of a deliberate process of making things worse. Doctorow outlines a three-step process. First, platforms are good to their users; second, they pivot to serving their business customers; finally, they extract value for themselves, intentionally eroding the experience for everyone. The process shows how something useful becomes something extractive.

What concerns me, and what should concern you too, is that the platforms we use in FE have sprouted new features and new products aligned with the current AI hype. Artificial Intelligence is the buzzword of the moment in education, and it seems everyone is jumping aboard the money train, tagging AI onto every product and process. This is not to villainise AI-based platforms, which do offer some genuinely helpful features that may well support teachers and reduce workload. Rather, it is to send up a distress flare: a warning about what might happen to teachers' professional knowledge if these platforms are adopted uncritically.

TeacherMatic, for example, arrived in 2023 with the promise of helping teachers manage and reduce their workload by automating tasks such as lesson planning. However, this creates an arbitrary separation between teaching and planning, as if planning were not a central component of teaching itself. It invites us to mentally dichotomise planning and doing, and tells us that AI will take care of the troublesome, undesirable business of planning.

Planning, however, should be understood as an abstract thinking and decision-making process in which teachers use professional knowledge to make professional judgements about what, how and why they teach (Zaragoza et al., 2023). Far from being a bureaucratic administrative task, lesson planning is a site where teacher judgement and professional knowledge are enacted and contextualised. Allowing our professional judgement and pedagogical decision-making to be offloaded, or indeed surrendered, therefore risks the long-term erosion of expertise, whatever the short-term benefits.

Workload pressures in FE are real. NFER (2024) found that many FE teachers considered their workload excessive and centred on administrative tasks, a problem my fellow Research Further Scholar, Katie Stafford (2024), regards as a ‘wicked problem’: one that is almost impossible to solve. But while some teachers may well consider lesson planning time-consuming and bureaucratic, the Education Endowment Foundation (EEF, 2023) found that, when interviewed, teachers identified lesson planning as a task they would not want to significantly reduce, considering it a central part of their professional responsibility. In other words, while planning is time-consuming, teachers recognise its importance.

The problem, then, is not that teachers want to avoid planning, but that they are all too often victims of a managerialist culture of audit and performativity. Too much time is spent proving they're doing their work rather than actually doing it. In such a context it's easy to see how AI-generated lesson planning becomes attractive. Teachers don't want to surrender professional judgement, but when they're trying to survive within a system demanding productivity, lesson planning becomes the easiest task to offload.

Of course, none of this is new; critical pedagogy theorists have long argued that neoliberalism reshapes education by aligning it with market-based values. However, enshittification isn't simply neoliberalism. The rise of the platforms has precipitated what Yanis Varoufakis calls technofeudalism, where we increasingly subscribe to (or rent) the services we used to own, be it software (Office365), music (Spotify) or books (Audible). GenAI now extends this to cognitive support. My worry is that GenAI tools will follow the same pattern as the other platforms: first being useful to their users, then pivoting to their business customers, before finally extracting maximum profit for their shareholders.

In teaching, the consequence starts as cognitive offloading: deferring subject-specific and pedagogical decision-making to large language models (LLMs). At first it's helpful; the pressure of your workload eases, and you find your pedagogy shifting, even improving. But over time, this becomes cognitive surrender: you are no longer able to plan and teach without AI's input, because your cognitive capacities have been handed over and your professional judgement outsourced.

Neil Postman calls this technopoly: a state in which society surrenders judgement to technological imperatives and human capabilities gradually erode. In a technopoly, alternatives are made invisible by redefining what we mean by art, privacy and intelligence. AI makes us redefine intelligence so that our definitions fit its requirements. In FE, this erosion would mean a weakening of professional judgement, expertise and pedagogical flexibility; the enshittification of teaching itself.

In the book, Doctorow argues that regulation, privacy protections and self-help can reverse enshittification; in FE, resistance begins with a few more modest steps.

  1. We use GenAI tools as occasional aids for genuine administrative duties, and resist reliance on them as a source of epistemic authority.
  2. We prioritise our own knowledge and learning. As teachers, knowledge is the capital in which we trade. Reaffirm your belief in the necessity of knowledge.
  3. We continue to advocate for protected planning time. Planning isn't just a nice thing to do, it's essential to our craft, our judgment and our expertise.

Enshittification in Further Education is not, then, the worsening of our technology, but the quiet, gradual separation of knowing from doing and the erosion of professional knowledge; and once we lose that knowledge, it won't be easily reclaimed.