Artificial intelligence can take the grunt work out of admin - Alex Warner
by Alex Warner, Principal: Curriculum Innovation and Pedagogy at MK College Group
Cards on the table: I have been described as a technophile, and as someone who enjoys engaging in a dose of futurology. Tempting as it is to offer a clickbait headline along the lines of "The Robots are Coming," I am optimistic about the future, and specifically about the future influence of technology on the education sector. There is currently a lot of hype around artificial intelligence (AI) and automation technologies, but the vital thing to remember is that the future is - in education at least - people.
By now most people will have heard of ChatGPT and its essay-writing skills. I was recently asked to produce a sixty-word bio for a conference I was attending, and thought I'd ask the programme to do it for me. It took seconds. It was okay; it wasn't as I would have written it, but it saved me the fifteen or twenty minutes it would probably have taken to do it myself. AI is functional, but not a perfect replacement for a real person. It can count, it can write, it can do many things which allow its human controllers to spend their time more fruitfully - that is to say, like a calculator or a cash till, it creates efficiencies.
Here's another example using DALL·E, an AI tool that creates pictures from text instructions. I asked it to design a "Further Education College of the Future" to coincide with a 2022 College of the Future Commission publication:
Pretty neat, but then I’m no architect. This does seem to me to be a bit more Jetsons and a bit less FE.
A number of more significant AI shortcomings have recently hit the headlines. For example, Google's Bard wiped $100 billion from the value of the company's shares with one mistake, while Microsoft's Bing Chat caused some consternation when it appeared to be threatening humanity. In the latter instance, Microsoft now limits the length of chat sessions to stop the AI going quite so seriously off piste.
What we really need to think about with regard to generative AI tools in education boils down to bias and ethics. First, bias. OpenAI trains its models on large data scrapes (a technique in which a computer program extracts data at scale from websites and other sources), and such data has the potential to include misogynistic, racist or sexist language, which the model then learns to reproduce. In terms of ethics, we will need robust processes to reduce the risk of plagiarism. Educators have successfully policed cheating for thousands of years, and I have no doubt we will be capable of doing so for another thousand.
So, AI has limitations and how we use it requires some careful thought. However, it is here, and it’s not going away. New technology can appear scary, especially when our understanding of it is limited. For example, in 1862, The Lancet reported on the findings of a commission of inquiry into The Influence of Railway Travelling on Public Health. It declared that train travel induced, “headaches, deafness, incessant noise in the ears, sleeplessness, depression, numbness of the limbs, chilliness, softening of the brain, spinal softening, epileptic seizures, and apoplexy.” Fifty years earlier, some doctors suggested that the human frame could not withstand such speedy forms of transport and that passengers would die. It’s easy to laugh at such assumptions now, but these were not the fears of ignorant people, but of educated, serious-minded physicians. They just lacked information.
At the South Central Institute of Technology in Milton Keynes (SCIoT) we specialise in the teaching and learning of the skills and knowledge needed to make the most of new tools like AI. And be in no doubt, however intelligent AI becomes, it will always be a tool; something imagined and created by people to make their lives easier.
There are things AI will never be able to do. AI cannot feel or demonstrate empathy, kindness or compassion. By definition, it cannot replicate humanity. What it can do is undertake some mundane or time-consuming tasks more quickly than we can, leaving us to do the nuanced stuff at which humans will always be better. AI will lighten the load for some of us, and be barely noticeable for others.
For me, there will be no substitute for being greeted by a warm smile from a member of staff as you enter a further education setting. There are the feelings of being safe and secure when in the presence of security teams across our college estates. Then there is the learning support assistant who reassures a student that “everything is okay,” and who listens, empathises, and helps rebuild an individual’s confidence. There are the canteen staff who act discreetly when providing a warm meal to someone eligible for a free lunch. There are those student experience champions who bring culture and vibrancy to campuses across the country. Nobody would want to replace any of these interactions with a robotic arm around the shoulder.
In Head Hand Heart: The Struggle for Dignity and Status in the 21st Century, David Goodhart discusses thought, craft and feeling. AI can support us with the grunt work, but it will never have the hand or heart needed to replace us.
The views expressed in Think Further publications do not necessarily reflect those of AoC or NCFE.