(C) Daily Kos. This story was originally published by Daily Kos and is unaltered. [1] (This content is not subject to review by Daily Kos staff prior to publication.) Date: 2024-11-17

FYI on AI (Hint: Not So Good)

As a teacher, teacher educator, researcher, and writer, I have been playing around with different AI studios. I prefer Meta AI to the various ChatGPT formats because it provides footnotes with links to the sources it references. According to the Pew Research Center, about one in four Americans has used ChatGPT since it was introduced in 2022. I've used AI to design lesson plans, complete student assignments, and as a search engine.

An article in The Washington Post, based on research by University of California, Riverside scientists on the environmental consequences of AI, has made me reconsider whether I ever want to use it again. AI is already built into search engines and email systems and is used billions of times a day. A 100-word email generated by ChatGPT-4 requires the equivalent of one bottle of water to cool the processors at its data center.

AI-generated lesson plans use a standard format that produces boring and repetitive lessons. A bigger problem is that they do not account for the variability that is at the core of professional teaching. Teachers adapt lessons to their students depending on the time of day, levels of student interest and performance in different classes, and the student population. You can teach the same lesson five times and it is never the same, except in the world of AI, which does not take into account that students are diverse human beings.

An assignment I used to give high school students, and now have pre-service teachers complete, is to write a 500-word essay explaining the significance and key ideas of a primary source historical document. Meta AI "wrote" a 390-word answer to the assignment, so I asked it additional questions to get 600 words that I could edit down.
The final result was not a bad essay; the vocabulary was appropriate, and it could have been written by an 11th-grade student, but there was a red flag. The AI essay contained information about the document that a student would not know based on a close reading of the text or discussion in class. If a student handed this in as their original work, a teacher should have picked it up immediately.

The Washington Post article raised additional problems with AI that I had not previously considered. Each AI request requires computers to run thousands of calculations to assemble answers. These operations generate heat that would melt down the computers, so electricity and water are used to keep the servers at the data centers from overheating and breaking down. In Northern Virginia communities, citizen groups are protesting the construction of new data centers because they are loud, energy hogs, eyesores, and don't provide long-term employment. In The Dalles, Oregon, data centers use about a quarter of all the water available to residents of the town. An additional problem is that an AI data center's drain on electricity and water drives up utility bills for people in the surrounding area.

An MIT Technology Review report found that artificial intelligence software used to generate images had the same carbon footprint as flying across the United States. Generating just one image uses as much energy as fully charging a smartphone.

Google claims, "Since our earliest days, we've been on an ambitious journey to help build a more sustainable future," and argues that "AI has the potential to help solve some of climate's biggest challenges. Scaling AI and using it to accelerate climate action will be just as crucial as addressing the environmental impact associated with it." However, a report released in July 2024 on Google's environmental impact the previous year showed that its "carbon emission footprint" rose by 48% compared to 2019 because of AI data centers.
Google also failed miserably to meet its environmental goals for water use.

My advice to students is to be aware of the environmental consequences of AI and to use it judiciously, if the tech companies will allow that. When doing research, Meta AI and ChatGPT, like Wikipedia and Google, should be a first stop to get ideas and references, not the last stop. Students who use AI to generate assignments and then hand them in as their own work may not get caught, but they are only cheating themselves because they are not learning the material and developing the skills they will need as they advance in school, get jobs, and lead productive lives. As the saying goes, "cheaters never prosper."

Former New York City Schools Chancellor David Banks championed the use of artificial intelligence programs, arguing that "AI can revolutionize how we function as a school system" by making evaluation of student work easier, personalizing instruction, and targeting remediation. I agree these things are possible, but technological miracles have been disappointing in the past. Focusing on AI can become an excuse not to deal with some of the more basic problems affecting student learning. In New York City during the 2022-2023 school year, over 100,000 students were homeless at some point, over 250,000 were English language learners, and one out of four children came from families living in poverty. In addition, there are exceedingly high teacher turnover rates: over 40% of the teachers hired in the 2012-13 school year left the school system within five years.

[END]

---

[1] https://www.dailykos.com/stories/2024/11/17/2286900/-FYI-on-AI-Hint-Not-So-Good?pm_campaign=front_page&pm_source=more_community&pm_medium=web