Subj : AI in education isn't a crisis, it's an indictment of the whole thi
To   : All
From : TechnologyDaily
Date : Sat Feb 03 2024 10:30:05

AI in education isn't a crisis, it's an indictment of the whole thing as a means to an end

Date: Sat, 03 Feb 2024 10:28:19 +0000

Description: In a new study published this week, over half of UK undergraduates say they are using artificial intelligence to help write essays. Does this implicate higher education and its methods more than it does students?

FULL STORY
======================================================================

Over 1,000 UK undergraduates were surveyed by the Higher Education Policy Institute (HEPI), and 53% admitted to using popular AI tools like ChatGPT, or its innumerable imitators, to create content, generate ideas, or both.

The Guardian phrases the next bit perfectly, so I'm going to quote them on it: "Just 5% admitted copying and pasting unedited AI-generated text into their assessments."

Right, so maths isn't the strongest point of anyone jumping to a froth-at-the-mouth conclusion, but that's at minimum 50 students, and almost certainly fewer than 100. True, this is one study's sample, but it's also a big one. This also isn't the first time studies like this have been done and prompted rethinks in how to secure academic integrity in the age of AI.

But if AI can trample all over university degrees and courses, doesn't that mean that they're no longer fit for purpose? Shouldn't educators adapt?

Adapting to AI in education

Well, they might be doing so. I'm reading a Wired article (paywalled) from just over a year ago at the time of writing, and its understanding of AI's role in plagiarism is slightly depressing: a lot of equivocation over "hmm, if a computer generated the content, is it plagiarism?", and not much recognition that AI, as we know it in this context, is just a computer that has been force-fed a human-produced (and often itself copyright-infringing) corpus, not a literal sentient being.
But the educators quoted in the Guardian article seem pretty switched on. Take the following: "I've implemented a policy of having mature conversations with students about generative AI. They share with me how they utilise it," said Dr Andres Guadamuz, Reader in Intellectual Property Law at the University of Sussex.

UK educators are also benefiting from the existence of AI. The Guardian writes that 58 secondary schools have been enlisted into a research project by the Education Endowment Foundation (EEF) in which teachers will use AI to generate lesson plans.

The report says nothing of how lecturers are taking to it, but I think it's likely that they are, given that members and representatives of the two main higher education unions in the UK, the University and College Union (UCU) and UNITE, have been locked in battle with universities over pay and working conditions since I was a student, and it looks like it's about to kick off again. Anything to lighten the load.

All of this sounds a hell of a lot more compassionate than hyping AI up to be the Antichrist and threatening students with a stain on their academic record without any attempt to, er, educate students about what AI is or does. That, at least, seems to be the overarching tone of that old Wired piece, despite an anecdote from a real-life, breathing student about how poor ChatGPT is at producing engaging, let alone informed, academic material, meaning they wouldn't use it anyway.

Personal anecdote break

I could get drummed out of the magic circle here, but officially, at Future PLC, TechRadar Pro's parent company, I'm a Graduate Junior Writer. My having gone to university, in a time before artificial intelligence, is basically the reason I get to register industrial-strength opinions that make no discernible difference to the way things are. I'm also a pretty solid opponent of generative artificial intelligence.
By and large, it's a way of laundering copyright infringement, diluting the work of individuals, and making things up as it goes along to produce a kind of tasty Swiss cheese prose. Bad actors (including, er, the HEPI study) call this last one "hallucination", but I think I'm going to call it lying. Where written content generation is concerned, Future PLC investigates AI use and disciplines staff when it uncovers plagiarism.

Yet now I find myself in the strange predicament of not caring about AI use, at least in the realm of education.

I don't care if students use AI to get a degree

Enticing heading, but it's not because I've received a dark money payment in the last thirty seconds to make me bang on about how AI is the future or whatever. It's because AI's one net good has been proving that the education system, and the way it is perceived by the working world, is broken.

We ran a story this week about how a majority of young people are struggling for job experience. I've personally faced this. Even getting this graduate role was, I believe, more down to my relevant job experience, which I absolutely debased myself to get, than the actual piece of paper I got from my university for my tens of thousands of pounds and unceasing toil. Reading it incensed me.

All of this is to say that the university degree has become so worthless, yet such a prerequisite of modern working life, that not only do I not care about the most egregious uses of AI in higher education, I'm actually somewhat saddened that the number of students engaging in that kind of use isn't higher.

AI use by students in assessments indicts university courses for being dull as dishwater, and too expensive for what they are, more than it does students for being hardened academic criminals. Some students don't test well, or learn differently, or are just there because, of course, you need a degree to get a job.
That was a round-peg-in-a-square-hole scenario even when higher education was more accessible, but now institutions are putting students in the same situation while also placing more financial constraints on them.

Given this, I would suggest either:

a) just giving the student the piece of paper, for God's sake, so they can get on with their life;

b) starting to phase out "you need a degree to work" as a culturally embedded principle, if you want people in work regardless, which you do; or

c) overhauling the assessment process so that it caters to multiple learning styles and dares to actually be interesting, which would also thwart the rise of artificial intelligence, or whatever.

My experience of how distinctly unbothered employers and educators alike are by degree content and structure leads me to believe that, had I been able to use AI at university, my life would not have changed in any meaningful way, other than vastly decreasing the sheer amount of spinal fluid wrung from me to get here.

AI, like everything that's made it into the zeitgeist at the behest of a nebulous, financially motivated actor, is nightmarish and a cesspit. However, the education system is also a nightmarish cesspit, and AI has helped reveal that. In this one particular scenario, AI in education doesn't need regulation; it's just doing what it's supposed to: regurgitating and bluffing back at you. If that's enough to pass for what undergraduates do anyway (I've been there; it was, and it is), and thus short-circuit higher education as we know it, then AI, for once, is not the problem, and the kids might actually be alright.

Workable solutions do exist

To be constructive, and to offer solutions more realistic than reversing decades of the commercialisation of higher education with yet more legislation, I do have some ideas.
Start by taking the rot(e) out of how assessments are delivered, in favour of a wider variety of projects, and by focussing on course content and delivery methods so that students actually want to engage with the assessment material. I concede, however, that this would still require ministers, secretaries, and university staff alike, all dutifully insistent on shooting themselves in the foot, to admit that they are wrong.

This sounds combative, but I should be fair. One senior figure in higher education who makes a solid argument along these lines is Professor Dilshad Sheikh, Deputy Pro-Vice Chancellor and Dean of the Faculty of Business at Arden University. She says that Arden, a blended and online higher education institution, is taking steps away from punishment and towards education when it comes to AI use.

Arden University argues that instead of punishing students for using such technology in all circumstances, or trying to train lecturers to notice the signs of AI-generated content, it should be teaching students how to use it to help enhance their work and processes. The university is, therefore, exploring how best to integrate AI into learning, teaching and assessment strategies, recognising that a positive, pioneering approach to AI is more beneficial to students.

"Many other universities are focusing on plagiarism and how AI chatbots give students the opportunity to cheat on assignments. However, the reality is that the technology cannot replicate understanding and application of knowledge in authentic assessments, which is how we design our courses.

"The truth of the matter is that times are changing, so how and what we teach should change too. AI will continue to get smarter to make our lives easier. We are seeing more and more businesses embracing such technology for the betterment of their growth, so why should we punish our students for using the same software being used in the real world?"
AI and the real world

This last point is pretty interesting, and one that I hadn't really considered until now. AI is being laundered into workplaces as a productivity tool, but its pitfalls there are surely the same as in education, as Future PLC has seen.

True, I've made no secret that I don't use AI and take a pretty dim view of the whole thing. But using AI responsibly, for prompts and for ideas rather than for content, and evangelising that kind of use in a learning environment, is perhaps making the best of a bad situation.

And, evidently, small but vitally important moves are being made from all sides in the UK's higher education system: to educate students, to engage critically with AI's unsuitability for producing excellent, insightful academic work, and to push for change in how degrees are taught, and thus re-engage students.

It's a good sign that the student-university transaction, though still a transaction, and one mandated by many workplaces in this country at this time, could be about to become more valuable to students, the people who benefit the most from it.

And then, who knows? We might just stop having to read about lecturers getting mad in national newspapers that their assessments not only can be passed by a computer literally making it up as it goes along, but that students are disengaged enough to prefer all of that to applying themselves. With higher education in the state it's in, I still don't blame them.

More from TechRadar Pro

- Amazon wants to train millions of people in basic AI skills
- Can AI transform personalized learning in schools?
- Microsoft is investing billions to bring AI to the UK

======================================================================
Link to news story:
https://www.techradar.com/pro/ai-in-education-isnt-a-crisis-its-an-indictment-of-the-whole-thing-as-a-means-to-an-end

--- Mystic BBS v1.12 A47 (Linux/64)
 * Origin: tqwNet Technology News (1337:1/100)