[HN Gopher] Implications of AI to schools
       ___________________________________________________________________
        
       Implications of AI to schools
        
       https://xcancel.com/karpathy/status/1993010584175141038
        
       Author : bilsbie
       Score  : 124 points
       Date   : 2025-11-24 17:51 UTC (5 hours ago)
        
 (HTM) web link (twitter.com)
 (TXT) w3m dump (twitter.com)
        
       | ekjhgkejhgk wrote:
       | In other words, learn to use the tool BUT keep your critical
       | thinking. Same with all new technologies.
       | 
        | I'm not minimizing Karpathy in any way, but this is obviously
        | the right approach.
        
       | trauco wrote:
        | This is the correct take. To contrast with the Terence Tao
        | piece from earlier (https://news.ycombinator.com/item?id=46017972),
        | AI research tools are increasingly useful if you're a competent
        | researcher who can judge the output and detect BS. You can't,
        | however, _become_ a Terence Tao by asking AI to solve your
        | homework.
       | 
        | So, in learning environments we may have no option but to open
        | the floodgates to AI use, but abandon most testing techniques
        | that are not, more or less, pen-and-paper and in-person. Use AI
        | as much as you want, but know that as a student you'll be
        | answering tests armed only with your brain.
       | 
        | I do pity English teachers, who have relied on essays to grade
        | proficiency for hundreds of years. STEM fields have an easier
        | path through this.
        
         | wffurr wrote:
         | Yesterday's Doonesbury was on point here:
         | https://www.gocomics.com/doonesbury/2025/11/23
         | 
          | Andrej and Garry Trudeau agree that "blue book exams" (i.e.
          | the teacher hands out a blank exam booklet, traditionally
          | blue, to fill out in person after devices are confiscated)
          | are the only way to assess students anymore.
         | 
         | My 7 year old hasn't figured out how to use any LLMs yet, but
         | I'm sure the day will come very soon. I hope his school
         | district is prepared. They recently instituted a district-wide
         | "no phones" policy, which is a good first step.
        
           | phantasmish wrote:
           | Blue book was the norm for exams in my social science and
           | humanities classes _way_ after every assignment was typed on
           | a computer (and probably a laptop, by that time) with
           | Internet access.
           | 
           | I guess high schools and junior highs will have to adopt
           | something similar, too. Better condition those wrists and
           | fingers, kids :-)
        
             | eitally wrote:
             | I'm oldish, but when I was in college in the late 90s we
             | typed a huge volume of homework (I was a history &
             | religious studies double major as an undergrad), but the
             | vast majority of our exams were blue books. There were
             | exceptions where the primary deliverable for the semester
             | was a lengthy research paper, but lots and lots of blue
             | books.
        
           | ecshafer wrote:
           | New York State recently banned phones state wide in schools.
        
           | nradov wrote:
           | Oh how I hated those as a student. Handwriting has always
           | been a slow and uncomfortable process for me. Yes, I tried
           | different techniques of printing and cursive as well as
           | better pens. Nothing helped. Typing on a keyboard is just so
           | much faster and more fluent.
           | 
           | It's a shame that some students will again be limited by how
           | fast they can get their thoughts down on a piece of paper.
           | This is such an artificial limitation and totally irrelevant
           | to real world work now.
        
             | wffurr wrote:
             | Maybe this is a niche for those low distraction writing
             | tools that pop up from time to time. Or a school managed
             | Chromebook that's locked to the exam page.
        
             | o11c wrote:
             | Obviously the solution is to bring back manual typewriters.
        
           | zahlman wrote:
           | > My 7 year old hasn't figured out how to use any LLMs yet,
           | but I'm sure the day will come very soon. I hope his school
           | district is prepared. They recently instituted a district-
           | wide "no phones" policy, which is a good first step.
           | 
           | This sounds as if you expect that it will become possible to
           | access an LLM in class without a phone or other similar
           | device. (Of course, using a laptop would be easily noticed.)
        
             | wffurr wrote:
             | The phone ban certainly helps make such usage noticeable in
             | class, but I'm not sure the academic structure is prepared
             | to go to in-person assessments only. The whole thread is
             | about homework / out of class work being useless now.
        
           | Fomite wrote:
            | In the process, we lose both the ability to accommodate
            | students and the ability to ask questions that take longer
            | than the test period to answer.
           | 
           | All for a calculator that can lie.
        
         | A4ET8a8uTh0_v2 wrote:
         | It is, but it does not matter, because:
         | 
          | 1. Corporate interests want to sell product
          | 
          | 2. Administrators want a product they can use
          | 
          | 3. Compliance people want a checkbox they can check
          | 
          | 4. Teachers want to be able to continue what they have been
          | doing thus far within the existing ecosystem
          | 
          | 5. Parents either don't know, don't care, or do care but are
          | unable to provide a viable alternative -- or can and do
          | provide one
         | 
          | We have had this conversation (although without the AI
          | component) before. None of it is really secret. The question
          | is really what the actual goal is. Right now, in the US,
          | education is mostly in name only -- unless you are involved
          | (which already means you are taking steps to correct it) or
          | are in the right zip code (which is not a guarantee, but it
          | improves your kids' odds).
        
       | ubj wrote:
       | One of my students recently came to me with an interesting
       | dilemma. His sister had written (without AI tools) an essay for
       | another class, and her teacher told her that an "AI detection
       | tool" had classified it as having been written by AI with "100%
        | confidence". The teacher was going to give her a zero on the
        | assignment.
       | 
       | Putting aside the ludicrous confidence score, the student's
       | question was: how could his sister convince the teacher she had
       | actually written the essay herself? My only suggestion was for
       | her to ask the teacher to sit down with her and have a 30-60
       | minute oral discussion on the essay so she could demonstrate she
       | in fact knew the material. It's a dilemma that an increasing
       | number of honest students will face, unfortunately.
        
         | vondur wrote:
         | I agree. Most campuses use a product called Turnitin, which was
         | originally designed to check for plagiarism. Now they claim it
         | can detect AI-generated content with about 80% accuracy, but I
         | don't think anyone here believes that.
        
           | phh wrote:
            | 80% is catastrophic though. In a classroom of 30 honest
            | pupils, will 6 get a 0 because the software says it's AI?
        
             | v9v wrote:
             | I suppose 80% means you don't give them a 0 mark because
             | the software says it's AI, you only do so if you have other
             | evidence reinforcing the possibility.
        
             | yoavm wrote:
             | The promise (not saying that it works) is probably that 20%
             | of people who cheated will not get caught. Not that 20% of
             | the work marked as AI is actually written by humans.
        
             | CaptainNegative wrote:
             | It depends on their test dataset. If the test set was
             | written 80% by AI and 20% by humans, a tool that labels
             | every essay as AI-written would have a reported accuracy of
             | 80%. That's why other metrics such as specificity and
             | sensitivity (among many others) are commonly reported as
             | well.
             | 
             | Just speaking in general here -- I don't know what specific
             | phrasing TurnItIn uses.
        
             | j45 wrote:
            | I think it means that every time AI is used, it will be
            | detected 80% of the time -- not that 20% of the class will
            | be marked as using AI.
        
             | kelseyfrog wrote:
             | 80% accuracy could mean 0 false negatives and 20% false
             | positives.
             | 
              | My point is that accuracy is a terrible metric here;
              | sensitivity and specificity tell us much more relevant
              | information for the task at hand. In that formulation, a
             | specificity < 1 is going to have false positives and it
             | isn't fair to those students to have to prove their
             | innocence.
        
               | soVeryTired wrote:
               | That's more like the false positive rate and false
               | negative rate.
               | 
                | If we're being literal, accuracy is (number of correct
                | guesses) / (total number of guesses). Maybe the folks at
               | turnitin don't actually mean 'accuracy', but if they're
               | selling an AI/ML product they should at least know their
               | metrics.
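The accuracy-versus-specificity point above can be made concrete with a small worked example. The confusion-matrix numbers below are invented for illustration (they are not Turnitin's real figures); they show how a detector can report 80% accuracy while still flagging a large share of honest students:

```python
# Invented confusion matrix: 100 essays, 50 AI-written, 50 human-written.
tp = 45  # AI essays correctly flagged (true positives)
fn = 5   # AI essays missed (false negatives)
tn = 35  # human essays correctly cleared (true negatives)
fp = 15  # honest students falsely flagged (false positives)

accuracy = (tp + tn) / (tp + fn + tn + fp)
sensitivity = tp / (tp + fn)  # share of AI essays caught
specificity = tn / (tn + fp)  # share of honest essays cleared

print(accuracy)     # 0.8 -- the single number a vendor might quote
print(sensitivity)  # 0.9
print(specificity)  # 0.7 -- 30% of honest essays would be accused
```

An 80% "accuracy" headline is compatible with many different false-positive rates, which is why the single number says little about how many honest students get flagged.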
        
           | tyleo wrote:
            | I had Turnitin mark my work as plagiarism some years ago,
            | and I had to fight it. It was clear the teacher wasn't
            | doing their job and was blindly following the tool.
           | 
           | What happened is that I did a Q&A worksheet but in each
           | section of my report I reiterated the question in italics
           | before answering it.
           | 
           | The reiterated questions of course came up as 100% plagiarism
           | because they were just copied from the worksheet.
        
             | pirates wrote:
             | This matches my experience pretty well. My high school was
             | using it 15 years ago and it was a spotty, inconsistent
             | morass even back then. Our papers were turned in over the
             | course of the semester, and late into the year you'd get
             | flagged for "plagiarizing" your own earlier paper.
        
             | teaearlgraycold wrote:
             | Funny how it's the teachers that are plagiarizing the work
             | of the tools.
        
           | vkou wrote:
           | > but I don't think anyone here believes that.
           | 
           | All it takes is one moron with power and a poor understanding
           | of statistics.
        
         | huevosabio wrote:
         | When I was in college, there was a cheating scandal for the
         | final exam where somehow people got their hands on the hardest
         | question of the exam.
         | 
         | The professor noticed it (presumably via seeing poor "show your
         | work") and gave zero points on the question to everyone. And
         | once you went to complain about your grade, she would ask you
         | to explain the answer there in her office and work through the
         | problem live.
         | 
         | I thought it was a clever and graceful way to deal with it.
        
           | j45 wrote:
            | This is a nice approach. The students who know the
            | material, or even those who prepare manually before seeing
            | the prof, achieve the objective of learning.
        
             | onion2k wrote:
             | It's not great for the teacher though. They're the ones who
             | will truly suffer from the proliferation of AI - increased
             | complexity of work around spotting cheating 'solved' by a
              | huge increase in time pressure. Faced with that, teachers
             | will have three options: accept AI detection as gospel
             | without appeals and be accused of unfairness or being bad
             | at the job by parents, spend time on appeals to the
             | detriment of other duties leading to more accusations of
             | being bad at the job, or leave teaching and get an easier
             | (and probably less stressful and higher paid) job. Given
             | those choices I'd pick the third option.
        
               | mavhc wrote:
               | 4. Use AI to talk to the student to find out if they
               | understand.
               | 
               | Tests were created to save money, more students per
               | teacher, we're just going back to the older, actually
               | useful, method of talking to people to see if they
               | understand what they've been taught.
               | 
               | You weren't asked to write an essay because someone
               | wanted to read your essay, only to intuit that you've
                | understood something.
        
           | lazyasciiart wrote:
           | Only if she advertised that option somehow. I worked two jobs
           | in college, I didn't take time off to go complain about my
           | grades.
        
             | rgblambda wrote:
             | Not to mention there'd be at least a few students too timid
             | to challenge the teacher, even if they knew they got it
             | right.
        
           | smileysteve wrote:
           | Lol, in 3rd grade algebra, a teacher called 2 of us in for
           | cheating. She had us take the test again, I got the same
           | exact horribly failing score (a 38%) and the cheater got a
           | better score, so the teacher then knew who the cheater was.
            | He just chose the wrong classmate to cheat off of.
        
             | darkwater wrote:
                | I don't get it. If she called you in too, it was
                | because your results were good, no? Who cheats to get a
                | bad result?
        
               | writebetterc wrote:
               | The students had identical answers, I presume
        
         | neom wrote:
          | Doesn't Google Docs have a fairly robust edit history? If I
          | were a student these days I'd either record my screen while
          | doing my homework, or at least work in Google Docs and share
          | the edit history.
        
           | germinalphrase wrote:
           | Yes. When I was an educator, reviewing version history was an
           | obvious way to clarify if/how much students plagiarized.
        
           | HelloUsername wrote:
            | This still leaves many options open for plagiarism (for
            | example, a second, separate device).
        
         | bad_haircut72 wrote:
          | Now imagine this, but it's a courtroom and you're facing 25
          | years.
        
           | stocksinsmocks wrote:
            | Family law judges, in my limited experience, are so
            | uninterested in the basic facts of a case that I would
           | actually trust an LLM to do a better job. Not quite what you
           | mean, but maybe there is a silver lining.
           | 
           | We are already (in the US) living in a system of soft social-
           | credit scores administered by ad tech firms and non-profits.
           | So "the algorithms says you're guilty" has already been
           | happening in less dramatic ways.
        
           | perihelions wrote:
           | https://news.ycombinator.com/item?id=14238786 ( _" Sent to
           | Prison by a Software Program's Secret Algorithms
           | (nytimes.com)"_)
           | 
            | https://news.ycombinator.com/item?id=14285116 ( _"
            | Justice.exe: Bias in Algorithmic sentencing
            | (justiceexe.com)"_)
           | 
            | https://news.ycombinator.com/item?id=43649811 ( _" Louisiana
            | prison board uses algorithms to determine eligibility for
            | parole (propublica.org)"_)
           | 
           | https://news.ycombinator.com/item?id=11753805 ( _" Machine
           | Bias (propublica.org)"_)
        
         | johanam wrote:
          | Edit history in Google Docs is a good way to defend yourself
          | from accusations of AI tool use.
        
           | andrewinardeer wrote:
            | Ironic that one of the biggest AI companies also offers a
            | service that protects you from allegations of using AI.
        
         | hiddencost wrote:
         | I seriously think the people selling AI detection tools to
         | teachers should be sued into the ground by a coalition of state
         | attorneys general, and that the tools should be banned in
         | schools.
        
         | j45 wrote:
          | Easy, if one of these options is available to the writer:
         | 
          | - Write it in Google Docs and share the edit history; it is
          | date- and time-stamped.
          | 
          | - Make a video of writing it in the Google Docs tab.
         | 
         | If this is available, and sufficient, I would pursue a written
         | apology to remind the future detectors.
         | 
         | Edit: clarity
        
         | obscurette wrote:
         | There have always been problems like this. I had a classmate
         | who wrote poems and short stories since age 6. No teacher
         | believed she wrote those herself. She became a poet, translator
         | and writer and admitted herself later in life that she wouldn't
         | have believed it herself.
        
         | mettamage wrote:
         | I would screencast the whole thing and then tell my professor
         | that we can watch a bit together.
        
         | rkagerer wrote:
         | Guess you have to videotape or screen-record yourself writing
         | it. Oh what a world we've created :-S.
        
           | inerte wrote:
           | You mean you'll prompt Sora to create a video of you writing
           | the essay :)
        
           | rcv wrote:
           | ... until you get accused of generating that video with
           | another AI.
        
             | bigfishrunning wrote:
              | No fair, I was born with 11 fingers!
        
         | jancsika wrote:
         | Seems like this could be practically addressed by teachers
         | adopting the TSA's randomized screening. That is, roll some
         | dice to figure out which student on a given assignment comes in
         | either for the oral discussion or-- perhaps in higher grades--
         | to write the essay in realtime.
         | 
         | It should be way easier than TSA's goal because you don't need
         | to _stop_ cheaters. You instead just need to ensure that you
         | _seed_ skills into a minimal number of achievers so that the
         | rest of the kids see what the _real_ target of education looks
         | like. Kids try their best not to learn, but when the need kicks
         | in they learn _way_ better spontaneously from their peers than
         | any other method.
         | 
         | Of course, this all assumes an effective pre-K reading program
         | in the first place.
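The randomized-screening idea above sketches easily; the roster, function name, and parameters below are hypothetical:

```python
import random

def pick_for_oral_review(students, k=3, seed=None):
    """Randomly pick k students to defend an assignment in person.

    With unpredictable sampling, every student faces some chance of
    having to explain their own work, without examining everyone.
    """
    rng = random.Random(seed)  # seed only for reproducible demos
    return rng.sample(students, k)

roster = ["ana", "ben", "cal", "dia", "eli", "fay"]
print(pick_for_oral_review(roster, k=2, seed=42))
```

Over a term of n assignments, a student's chance of never being picked shrinks as (1 - k/len(roster))**n, so even a small k builds deterrence.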
        
         | FloorEgg wrote:
         | Write it in something like Google docs that tracks changes and
         | then share the link with the revision history.
         | 
         | If this is insufficient, then there are tools specifically for
         | education contexts that track student writing process.
         | 
         | Detecting the whole essay being copied and pasted from an
         | outside source is trivial. Detecting artificial typing patterns
         | is a little more tricky, but also feasible. These methods
         | dramatically increase the effort required to get away with
         | having AI do the work for you, which diminishes the benefit of
         | the shortcut and influences more students to do the work
         | themselves. It also protects the honest students from false
         | positives.
        
           | fuzzythinker wrote:
            | I thought it was a good idea at first, but it can easily be
            | defeated by typing out the AI's output manually. One can add
            | pauses/deletions/edits, or make true edits by joining ideas
            | from different AI outputs.
        
             | FloorEgg wrote:
             | > Detecting artificial typing patterns is a little more
             | tricky, but also feasible.
             | 
              | Keystroke dynamics can detect artificial typing patterns
              | (copying another source by typing it out manually). If a
              | student has to go way out of their way to make their
              | behavior appear authentic, that decreases the advantage of
              | cheating, and fewer students will do it.
             | 
             | If the student is integrating answers from multiple AI
             | responses then maybe that's a good thing for them to be
             | learning and the assessment should allow it.
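The comment doesn't specify how keystroke-dynamics checks work; as one hedged sketch, a naive signal is how uniform the gaps between keypresses are -- human typing is bursty, while scripted replay tends toward metronomic timing. The function name and threshold below are purely illustrative:

```python
from statistics import mean, stdev

def looks_mechanical(timestamps_ms, cv_threshold=0.2):
    """Flag typing whose inter-key gaps are suspiciously uniform.

    timestamps_ms: keypress times in milliseconds. A low coefficient
    of variation (stdev / mean) of the gaps suggests replayed input.
    """
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    if len(gaps) < 2 or mean(gaps) == 0:
        return False
    return stdev(gaps) / mean(gaps) < cv_threshold

print(looks_mechanical([0, 100, 200, 300, 400, 500]))  # True: metronomic
print(looks_mechanical([0, 90, 310, 350, 620, 700]))   # False: bursty
```

A real system would use richer features (digraph timings, pause distributions), and any fixed heuristic of this kind can be defeated by an agent that injects human-like jitter.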
        
               | darkwater wrote:
                | It will take no time for some (smarter?) student to
                | create an AI agent that mimics keystrokes.
        
         | jstummbillig wrote:
         | How is that a dilemma for the students? What are their supposed
         | options?
        
       | renewiltord wrote:
       | This couldn't have happened at a better time. When I was young my
       | parents found a schooling system that had minimal homework so I
       | could play around and live my life. I've moved to a country with
        | a lot less flexibility. Now that my kids will soon be going to
        | school, compulsory homework will be obsolete.
       | 
       | Zero homework grades will be ideal. Looking forward to this.
        
         | danielbln wrote:
          | If AI gets us reliably to a flipped classroom (research at
          | home, work through problems during class), then I'm here for
          | it. Homework in the traditional sense is an anti-pattern.
        
           | mavhc wrote:
            | Agreed, the Gutenberg method is preferred:
            | 
            | 1. Assume the printing press exists
            | 
            | 2. Now there's no need for a teacher to stand up and deliver
            | information by talking to a class for 60 minutes
            | 
            | 3. Therefore students can read at home (or watch prepared
            | videos) and test their learning in class, where there are
            | experts to support them
            | 
            | 4. Given we only need one copy of the book/video/interactive
            | demo, we can spend wayyyyy more money making it the best it
            | can possibly be
            | 
            | What's sad is that it's 500 years later and education has
            | barely changed.
        
             | vkou wrote:
             | > What's sad is it's 500 years later and education has
             | barely changed
             | 
             | From my extensive experience of four years of undergrad,
             | the problem in your plan is "3. Therefore students can read
             | at home " - half the class won't do the reading, and the
             | half that did won't get what it means until they go to
             | lecture[1].
             | 
             | [1] If the lecturer is any good at all. If he spends most
             | of his time ranting about his ex-wife...
        
         | speff wrote:
         | Most of what I learned in college was only because I did
          | homework and struggled to figure it out myself. Classroom
          | time was essentially just a heads-up about what I'd actually
          | be learning myself later.
         | 
         | Granted, this was much less the case in grade school - but if
         | students are going to see homework for the first time in
         | college, I can see problems coming up.
         | 
         | If you got rid of homework throughout all of the "standard"
         | education path (grade school + undergrad), I would bet a lot of
         | money that I'd be much dumber for it.
        
           | vkou wrote:
           | > but if students are going to see homework for the first
           | time in college, I can see problems coming up.
           | 
           | If the concept is too foreign for them, I'm sure we could
           | figure out how to replicate the grade school environment.
           | Give them their 15 hours/week of lecture, and then lock them
           | in a classroom for the 30 hours they should spend on
           | homework.
        
         | lazyasciiart wrote:
          | Now _that's_ an optimistic take!
        
       | ecshafer wrote:
       | In my CS undergrad I had Doug Lea as a professor, really
       | fantastic professor (best teacher I have ever had, bar none). He
        | had a really novel way to handle homework hand-ins: you had to
        | demo the project. So you got him to sit down with you, you ran
       | the code, he would ask you to put some inputs in (that were
       | highly likely to be edge cases to break it). Once that was
       | sufficient, he would ask you how you did different things, and to
       | walk him through your code. Then when you were done he told you
       | to email the code to him, and he would grade it. I am not sure
       | how much of this was an anti-cheating device, but it required
       | that you knew the code you wrote and why you did it for the
       | project.
       | 
        | I think that AI has the possibility of weakening some aspects
        | of education, but I agree with Karpathy here: in-class work,
        | in-person defenses of work, verbal tests. These were
        | cornerstones of education for thousands of years and have been
        | cut out over the last 50 years or so outside of a few niche
        | cases (thesis defense), and it might be a good thing that they
        | come back.
        
         | SirMaster wrote:
         | So we are screwed once we get brain-computer interfaces?
        
         | mercacona wrote:
          | Yep, it's easy to counter AI plagiarism, but you need time.
          | In most universities around the world (online universities
          | especially), the number of students is way too big, while
          | professors get more and more pressure from publishing and
          | bureaucracy.
        
           | ecshafer wrote:
            | I did my masters in GaTech OMSCS (ChatGPT came out at the
            | very end of my last semester). Tests were done with cameras
            | on, and the recordings were then watched, I think by TAs.
            | Homework was run through automated checking and a
            | plagiarism checker. Do you need in-person proctoring via
            | test centers or libraries? Video chats with professors? I
            | am not sure. Projects are important, but maybe they need to
            | become a minority of grades, with more weight on theory, to
            | circumvent AI?
        
           | ghaff wrote:
           | It's not even about plagiarism. But, sure, 1:1 or even 1:few
            | instruction is great, but even at elite schools it's not
            | really very practical. I went to what's considered a very
            | good engineering school, and classes with hundreds of
            | students were pretty normal.
        
       | charcircuit wrote:
        | This doesn't address the point that AI can replace going to
        | school. AI can be your perfect personal tutor to help you learn
        | things 1:1. Needing to have a teacher and prove to them that
        | you know what they taught will become a legacy concept. That we
        | have an issue of AI cheating at school is, in my eyes, a
        | temporary issue.
        
         | alariccole wrote:
         | ChatGPT just told me to put the turkey in my toaster oven legs
         | facing the door, and you think it can replace school. Unless
         | there is a massive architectural change that can be provably
         | verified by third parties, this can never be. I'd hate for my
         | unschooled surgeon to check an llm while I'm under.
        
           | CamperBob2 wrote:
           | Just curious, not being a turkey SME, what's the downside to
           | positioning the turkey that way?
        
             | patrickmay wrote:
             | Most turkeys of my acquaintance would not fit into a
             | toaster oven without some percussive assistance.
        
               | CamperBob2 wrote:
               | I see, I overlooked the 'toaster' part. That's a good
               | world model benchmark question for models and a good
               | reading comprehension question for humans. :-P
               | 
               | GPT 5.1 Pro made the same mistake ("Face the legs away
               | from the door.") Claude Sonnet 4.5 agreed but added
               | "Note: Most toaster ovens max out around 10-12 pounds for
               | a whole turkey."
               | 
               | Gemini 3 acknowledged that toaster ovens are usually very
               | compact and that the legs shouldn't be positioned where
               | they will touch the glass door. When challenged, it hand-
               | waved something to the effect of "Well, some toaster
               | ovens are large countertop convection units that can hold
               | up to a 12-pound turkey." When asked for a brand and
               | model number of such an oven, it backtracked and admitted
               | that no toaster oven would be large enough.
               | 
               | Changing the prompt to explicitly specify a 12-pound
               | turkey yielded good answers ("A 12-pound turkey won't fit
               | in a toaster oven - most max out at 4-6 pounds for
               | poultry. Attempting this would be a fire hazard and
               | result in dangerously uneven cooking," from Sonnet.)
               | 
               | So, progress, but not enough.
        
           | charcircuit wrote:
            | What's the alternative if someone didn't know something
            | during a procedure? Just wing it? Getting another data
            | point from an LLM seems beneficial to me.
        
             | axus wrote:
             | Ask a human who does. If there are no competent humans on-
             | call before the procedure starts, reschedule the procedure.
        
         | qingcharles wrote:
          | For someone who wants to learn, I agree with this 100%. AI
          | has been great at teaching me about hundreds of topics.
         | 
          | I don't yet know how we get AI to teach unruly kids, or
          | neurodivergent kids. Perhaps, though, AI can eventually be
          | vastly superior to an adult because of the methods it can use
          | to get through to the child, keep the child interested, and
          | present the teaching in a much more interactive way.
        
         | sharkjacobs wrote:
         | It is considered valuable and worthwhile for a society to
         | educate all of its children/citizens. This means we have to
         | develop systems and techniques to educate all kinds of people,
         | not just the ones who can be dropped off by themselves at a
         | library when they turn five, and picked up again in fifteen
         | years with a PhD.
        
           | charcircuit wrote:
           | Sure. People who are self motivated are who will benefit the
           | earliest. If a society values ensuring every single citizen
           | gets a baseline education they can figure out how to get an
           | AI to persuade or trick people into learning better than a
           | human could.
        
         | nathan_compton wrote:
         | Snap out of it. This is the best advice I can give you.
        
           | charcircuit wrote:
           | Snap out of what? I use chatgpt for learning every day.
        
       | qsort wrote:
       | It's a fair question, but there's maybe a bit of US defaultism
       | baked in? If I look back at my exams in school they were mostly
       | closed-book written + oral examination, nothing would really need
       | to change.
       | 
       | A much bigger question is what to teach assuming we get models
       | much more powerful than those we have today. I'm still confident
       | there's an irreducible hard core in most subjects that's well
       | worth knowing/training, but it might take some soul searching.
        
       | mark242 wrote:
       | "You have to assume that any work done outside classroom has used
       | AI."
       | 
       | That is just such a wildly cynical point of view, and it is
       | incredibly depressing. There is a whole huge cohort of kids out
       | there who genuinely want to learn and want to do the work, and
       | feel like using AI is cheating. These are the kids who,
       | ironically, AI will help the most, because they're the ones who
       | will understand the fundamentals being taught in K-12.
       | 
       | I would hope that any "solution" to the growing use of AI-as-a-
       | crutch can take this cohort of kids into consideration, so their
       | development isn't held back just to stop the less-ethical student
       | from, well, being less ethical.
        
         | tgv wrote:
         | > There is a whole huge cohort of kids out there
         | 
         | Well, it seems the vast majority doesn't care about cheating,
         | and is using AI for everything. And this is from primary school
         | to university.
         | 
         | It's not just that AI makes it simpler: many pupils cannot
         | concentrate anymore. TikTok and others have fried their minds.
         | So AI is a quick way out for them. Back to their addiction.
        
           | drivebyhooting wrote:
           | Addiction created by you and me, laboring for Zuck's profit.
           | 
           | There's a reason this stuff is banned in China. Their pupils
           | suffer no such opiate.
        
         | sharkjacobs wrote:
         | Sure, but the point is that if 5% of students are using AI then
         | you have to assume that any work done outside classroom has
         | used AI, because otherwise you're giving a massive advantage to
         | the 5% of students who used AI, right?
        
         | techblueberry wrote:
         | What possible solution could prevent this? The best students
         | are learning on their own anyways, the school can't stop
         | students using AI for their personal learning.
         | 
         | There was a reddit thread recently that asked the question:
         | are all students really doing worse? It basically said that
         | there are still top performers performing toply, but that the
         | middle has been hollowed out.
         | 
         | So I think, I dunno, maybe depressing. Maybe cynical, but
         | probably true. Why shy away from the truth?
         | 
         | And by the way, I would be both. Probably would have used AI to
         | further my curiosity and to cheat. I hated school, would
         | totally cheat to get ahead, and am now wildly curious and
         | ambitious in the real world. Maybe this makes me a bad person,
         | but I don't find cheating in school to be all that unethical.
         | I'm paying for it, who cares how I do it.
         | 
         | People aren't one thing.
        
           | ACCount37 wrote:
           | AI is a boon to students who are intrinsically motivated.
           | Most students aren't.
        
       | KerryJones wrote:
       | It seems like a good path forward is to try to replicate the
       | idea of "once you can do it yourself, feel free to use it
       | going forward" (knowing how various calculator operations work
       | before you let the calculator do them for you).
       | 
       | I'm curious if we instead _gave_ students an AI tool, but one
       | that would intentionally throw in _wrong_ things that the student
       | had to catch. Instead of the student using LLMs, they would have
       | one paid for by the school.
       | 
       | This is more brainstorming than a well thought-out idea, but I
       | generally think "opposing AI" is doomed to fail. If we follow
       | a Montessori approach, kids are _naturally inclined_ to want
       | to learn things; if students are trying to lie/cheat, we've
       | already failed them by turning off their natural curiosity for
       | something else.
        
         | jay_kyburz wrote:
         | I agree. I think schools and universities need to adapt;
         | just like calculators, these things aren't going away. Let
         | students leverage AI as tools and come out of uni more
         | capable than we did.
         | 
         | AIs _do_ currently throw in an occasional wrong thing.
         | Sometimes a lot. A student's job needs to be verifying and
         | fact-checking the information the AI is telling them.
         | 
         | The student's job becomes asking the right questions and
         | verifying the results.
        
       | sharkjacobs wrote:
       | > The students remain motivated to learn how to solve problems
       | without AI because they know they will be evaluated without it in
       | class later.
       | 
       | Learning how to prepare for in-class tests and writing exercises
       | is a very particular skillset which I haven't really exercised a
       | lot since I graduated.
       | 
       | Never mind teaching the humanities, for which I think this is
       | a genuine crisis; in-class programming exams are basically the
       | same thing as leetcode job interviews, and we all know what a
       | bad proxy those are for "real" development work.
        
         | iterateoften wrote:
         | I use it every day.
         | 
         | Preparing for a test requires understanding what the
         | instructor wants. Concentrate on the wrong thing and you get
         | marked down.
         | 
         | Same applies to working in a corporation. You need to
         | understand what management wants. It's a core requirement.
        
         | yannyu wrote:
         | > in class programming exams are basically the same thing as
         | leetcode job interviews, and we all know what a bad proxy those
         | are for "real" development work.
         | 
         | Confusing university learning for "real industry work" is a
         | mistake and we've known it's a mistake for a while. We can have
         | classes which teach what life in industry is like, but assuming
         | that the role of university is to teach people how to fit
         | directly into industry is mistaking the purpose of university
         | and K-12 education as a whole.
         | 
         | Writing long-form prose and essays isn't something I've done in
         | a long time, but I wouldn't say it was wasted effort. Long-form
         | prose forces you to do things that you don't always do when
         | writing emails and powerpoints, and I rely on those skills
         | every day.
        
           | crooked-v wrote:
           | There's no mistake there for all the students looking at job
           | listings that treat having a college degree as a hard
           | prerequisite for even being employable.
        
       | mercacona wrote:
       | I've been following this approach since last school year. I focus
       | on in-class work and home-time is for reading and memorization.
       | My classmates still think classrooms are for lecturing, but it's
       | coming. The paper-and-pen era is back to school!
        
       | Kelvinidan wrote:
       | I recently wrote on something similar. I think the way we design
       | evaluation methods for students needs to mirror the design of
       | security systems. https://kelvinpaschal.com/blog/educators-
       | hackers/
        
       | TheAceOfHearts wrote:
       | I think legacy schooling just needs to be reworked. Kids should
       | be doing way more projects that demonstrate the integration of
       | knowledge and skills, rather than focusing so much energy on
       | testing and memorization. There's probably a small core of things
       | that really must be fully integrated and memorized, but for
       | everything else you should just give kids harder projects which
       | they're expected to solve by leveraging all the tools at their
       | disposal. Focus on teaching kids how to become high-agency beings
       | with good epistemics and a strong math core. Give them
       | experiments and tools to play around and actually understand how
       | things work. Bring back real chemistry labs and let kids blow
       | stuff up.
       | 
       | The key issue with schools is that they crush your soul and turn
       | you into a low-agency consumer of information within a strict
       | hierarchy of mind-numbing rules, rather than helping you develop
       | your curiosity hunter muscles to go out and explore. In an ideal
       | world, we would have curated gardens of knowledge and information
       | which the kids are encouraged to go out and explore. If they find
       | some weird topic outside the garden that's of interest to them,
       | figure out a way to integrate it.
       | 
       | I don't particularly blame the teachers for the failings of
       | school though, since most of them have their hands tied by strict
       | requirements from faceless bureaucrats.
        
         | yannyu wrote:
         | As much as I hated schooling, I do want to say that there are
         | parts of learning that are simply hard. There are parts that
         | you can build enthusiasm for with project work and prioritizing
         | for engagement. But there are many things that people should
         | learn that will require drudgery to learn and won't excite all
         | people.
         | 
         | Doing derivatives, learning the periodic table, basic
         | language and alphabet skills, and playing an instrument are
         | foundational skills that will require deliberate practice to
         | learn, something that isn't typically part of project-based
         | learning.
         | At some point in education with most fields, you will have to
         | move beyond concepts and do some rote memorization and
         | repetition of principles in order to get to higher level
         | concepts. You can't gamify your way out of education, despite
         | our best attempts to do so.
        
           | FloorEgg wrote:
           | Most learning curves in the education system today are very
           | bumpy and don't adapt well to the specific student. Students
           | get stuck on big bumps or get bored and demotivated at
           | plateaus.
           | 
           | AI has potential to smooth out all curves so that students
           | can learn faster and maximize time in flow.
           | 
           | I've spent literally thousands of hours thinking about this
           | (and working on it). The future of education will be as
           | different from today as today is to 300 years ago.
           | 
           | Kids used to get smacked with a stick if they spelled a word
           | wrong.
        
             | seg_lol wrote:
             | There is a huge opportunity here to have the stick smacking
             | be automated and timed to perfection.
        
         | SunshineTheCat wrote:
         | You are 100% right on this. There is a reason school is so
         | vastly different from the process most people follow when
         | learning something on their own.
         | 
         | Doing rather than memorizing outdated facts in a textbook.
        
       | kingstnap wrote:
       | Having had some experience teaching, designing labs, and
       | evaluating students, in my opinion there is basically no
       | problem that can't be solved with more instructor work.
       | 
       | The problem is that the structure pushes for teaching
       | productivity which basically directly opposes good pedagogy at
       | this point in the optimization.
       | 
       | Some specifics:
       | 
       | 1. Multiple choice sucks. It's obvious that written response
       | better evaluates students and oral is even better. But multiple
       | choice is graded instantly by a computer. Written response needs
       | TAs. Oral is such a time sink and needs so many TAs and lots of
       | space if you want to run them in parallel.
       | 
       | 1.5 Similarly, having students do things on computers is nice
       | because you don't have to print things, and even errors in the
       | question can be fixed live by asking students to refresh the
       | page. But if the chatbots let them cheat too easily on
       | computers, doing handwritten assessments sucks because you
       | have to go arrange for printing and scanning.
       | 
       | 2. Designing labs is a clear LLM tradeoff. Autograded labs
       | with testbenches and fill-in-the-middle style completions or
       | API completions are incredibly easy to grade. You just pull
       | the commit before some specific deadline and run some scripts.
       | 
       | You can do 200 students in the background while doing other
       | work, it's so easy. But the problem is that LLMs are so good
       | at fill-in-the-middle and making testbenches pass.
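The "pull the commit before some specific deadline" step is easy to automate. A minimal sketch of the deadline-selection logic (the `(timestamp, sha)` log format and the pytest grading command are illustrative assumptions, not the commenter's actual setup):

```python
def commit_before_deadline(commits, deadline):
    """From (timestamp, sha) pairs (e.g. parsed from the output of
    `git log --format='%cI %H'`), return the sha of the latest commit
    made at or before the deadline, or None if the student pushed
    nothing in time."""
    eligible = [(ts, sha) for ts, sha in commits if ts <= deadline]
    return max(eligible)[1] if eligible else None

# Grading the selected commit could then be a shell step such as:
#   git checkout <sha> && pytest tests/ --tb=no -q
```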
       | 
       | I've actually tried some more open-ended labs before, and it's
       | very impressive how creative students are. They are obviously
       | not LLMs; there is a diversity in thought and simplicity of
       | code that you do not get with ChatGPT.
       | 
       | But it is ridiculously time consuming to pull people's code and
       | try to run open ended testbenches that they have created.
       | 
       | 3. Having students do class presentations is great for evaluating
       | them. But you can only do like 6 or 7 presentations in a 1 hr
       | block. You will need to spend like a week even in a relatively
       | small class.
       | 
       | 4. What I will say LLMs are fun for is having students do
       | open-ended projects with faster iterations. You can scope
       | creep the projects if you expect students to use AI coding.
        
       | SunshineTheCat wrote:
       | I think part of the reason AI is having such a negative effect on
       | schools in particular is because of how many education processes
       | are reliant on an archaic, broken way of "learning." So much of
       | it is focused upon memorization and regurgitation of information
       | (which AI is unmatched at doing).
       | 
       | School is packed with inefficiency and busywork that is
       | completely divorced from the way people learn on their own. In
       | fact, it's pretty safe to say you could learn something about
       | 10x faster by typing it into an AI chatbot and having it
       | tailor the experience to you.
        
         | FloorEgg wrote:
         | Yes, the biggest problem with authentic exercises is evaluating
         | the students' actions and giving feedback. The problem is that
         | authentic assessments didn't previously scale (e.g. what worked in
         | 1:1 coaching or tutoring couldn't be done for a whole
         | classroom). But AI can scale them.
         | 
         | It seems like AI will destroy education, but it's only
         | breaking the old education system; it will also enable a new
         | and much better one, where students make more and faster
         | progress developing more relevant and valuable skills.
         | 
         | The education system uses multiple-choice quizzes and tests
         | because their grading can be automated.
         | 
         | But when evaluation of _any_ exercise can be automated with AI,
         | such that students can practice any skill with iterative
         | feedback at the pace of their own development, so much human
         | potential will be unlocked.
        
       | bilsbie wrote:
       | I submitted this but why is there an xcancel link added to it?
        
         | crabmusket wrote:
         | X is a hostile experience when not logged in.
        
       | paulorlando wrote:
       | I did a lot of my blog and book writing before these AI tools,
       | but now I show my readers images of handwritten notes and drafts
       | (more out of interest than demonstrating proof of work).
        
       | ishtanbul wrote:
       | Here is my proposal for AI in schools: raise the bar
       | dramatically. Rather than trying to prevent kids from using AI,
       | just raise the expectations of what they should accomplish with
       | it. They should be setting really lofty goals rather than just
       | doing the same work with less effort.
        
         | alexose wrote:
         | Absolutely. I'd love to see the same effect happen in the
         | software industry. Keep the volume of output the same, but
         | increase the quality.
        
           | ojr wrote:
           | That is what they do in the software industry. Before, it
           | was "let me catch you off guard by asking how to reverse a
           | linked list"; now it's leetcode questions so hard that you
           | need to know and study them weekly, and prep for a year.
           | The interviewer can tell if you started prep 3 weeks
           | prior.
        
           | oytis wrote:
           | > Keep the volume of output the same, but increase the
           | quality.
           | 
           | Effect of AI applied to coding is precisely the opposite
           | though?
        
         | oytis wrote:
         | AI doesn't help you do higher quality work. It helps you do (or
         | imitate) mediocre work faster. But the thing is, it is hard to
         | learn how to do excellent work without learning to do mediocre
         | work first.
        
       | enjeyw wrote:
       | I made a tool for this! It's an essay writing platform that
       | tracks the edits and keystrokes rather than the final output, so
       | its AI detection accuracy is _much_ higher than other tools:
       | https://collie.ink/
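I don't know collie.ink's internals, but the core signal available from an edit/keystroke log can be sketched with a toy heuristic (the event format and the 200-character threshold below are made up for illustration): human typing arrives as many small insertions over time, while pasted or generated text lands in large single bursts.

```python
def paste_bursts(events, burst_chars=200):
    """Return insert events that add a large block of text in one
    edit -- a crude signal the text was pasted rather than typed.
    Each event is a dict like {"type": "insert", "text": "..."}."""
    return [e for e in events
            if e["type"] == "insert" and len(e["text"]) >= burst_chars]

def looks_pasted(events, burst_chars=200, ratio=0.5):
    """True when a majority of inserted characters arrived in bursts."""
    total = sum(len(e["text"]) for e in events if e["type"] == "insert")
    burst = sum(len(e["text"]) for e in paste_bursts(events, burst_chars))
    return total > 0 and burst / total >= ratio
```

A real system would presumably also weight inter-keystroke timing, revision churn, and out-of-order edits, but even a crude burst ratio separates typed drafts from wholesale pastes.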
        
       | 11101010001100 wrote:
       | As a teacher, I try to keep an open mind, but I can
       | consistently find out within 5 minutes of talking to a student
       | whether they understand the material. I might just go all in
       | on oral exams.
        
       | adidoit wrote:
       | This is exactly why I'm focusing on job readiness and remediation
       | rather than the education system. I think working all this out
       | is simply too complex for a system with a lot of vested
       | interests that doesn't really understand how AI is evolving.
       | There's an
       | arms race between students, teachers, and institutions that hire
       | the students.
       | 
       | It's simply too complex to fix. I think we'll see increased
       | investment from the corporates who do keep hiring, aimed at
       | remediating the gaps in their workforce.
       | 
       | Most elite institutions will probably increase their efforts
       | spent on interviewing including work trials. I think we're
       | already seeing this with many of the elite institutions talking
       | about judgment, emotional intelligence, and critical thinking as more
       | important skills.
       | 
       | My worry is that hiring turns into a test of likeability rather
       | than meritocracy (everyone is a personality hire when cognition
       | is done by the machines)
       | 
       | Source: I'm trying to build a startup (Socratify), a bridge
       | for upskilling early-stage professionals from a flawed
       | education system into the workforce.
        
       | theideaofcoffee wrote:
       | How about just dispense with the AI nonsense in education and go
       | to totally in-person, closed-book, manually-written, proctored
       | exams? No homework, no assignments, no projects. Just pure mind-
       | to-paper writing in a bare room under the eye of an examiner.
       | Those that want to will learn and will produce intelligent work
       | regardless of setting.
        
       | thinkindie wrote:
       | this is a very American issue. In my entire student career in
       | Italy, home assignments were never graded. Maybe you had a
       | project or two through university, but otherwise I got all my
       | grades from onsite tests.
        
       | cadamsdotcom wrote:
       | AI can generate questions. It is feasible in principle to give
       | every student a different exam - and use the same or another AI
       | to ensure they're all exactly the same difficulty, either by
       | generating them carefully, or by generating lots of candidate
       | exams and rejecting the ones that are too hard/too easy.
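The generate-and-reject scheme described here is essentially rejection sampling on a difficulty estimate. A sketch, where `generate_exam` and `estimate_difficulty` are hypothetical stand-ins for model calls (the estimator could itself be an LLM grader, or a statistical model fit on past cohorts):

```python
def select_exams(generate_exam, estimate_difficulty, n_exams,
                 lo=0.45, hi=0.55, max_tries=1000):
    """Generate candidate exams and keep only those whose estimated
    difficulty (say, expected fraction of points lost) falls inside
    the [lo, hi] band, so every student gets a different but
    comparably hard paper."""
    picked = []
    for _ in range(max_tries):
        exam = generate_exam()
        if lo <= estimate_difficulty(exam) <= hi:
            picked.append(exam)
            if len(picked) == n_exams:
                break
    return picked
```

The same idea works in the other direction mentioned in the comment: instead of filtering finished exams, the generator can be prompted per question and each question filtered, which wastes fewer candidates.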
        
       ___________________________________________________________________
       (page generated 2025-11-24 23:00 UTC)