AI use in schools fits right into the authoritarian playbook of 'keep them ignorant'. Critical thinking skills are crucial in developing solutions to problems. Problem solving is essential to improve the lives of everyone everywhere. When the populace uses these skills, they begin to question the unsatisfactory decisions of those in charge, which is essentially what a fascist dictator fears.
I don't think critical thinking for problem-solving automatically leads to criticizing leadership. Look at China: they seem to have incorporated problem-solving without having that problem. They wouldn't be considered a threat to Western powers if they hadn't.
China's problem is that the government will destroy you if you don't toe the line (just what Trump is trying to do). And of course AI doesn't lead to criticizing leadership. Having a working brain that sees leadership trying to keep us ignorant is what leads to criticizing the government.
There don't seem to be enough working brains these days. Look at who is in the White House. I knew Trump's type was bad news from the moment he declared his first run for president, even when my own family thought he was nothing but a clown. This is why we must all study history as if our lives depended on it. Because they do.
Yeah, it feels like we're going to get AI continuously shoved down our throats. Not all uses of artificial intelligence are bad per se, but context matters, and particularly for large models, the environmental implications shouldn't be ignored.
Right. And powered by "clean" coal.
"GOD's coal."
As a teacher I have seen the use of AI by students increase dramatically in the past 2-3 years. Students are predominantly using this as a shortcut, not to deepen learning. I don’t blame them for doing this as the school day is a grind, with the majority of middle school and high school students carrying a load of 6-8 courses. That in itself is a problem, but this article is about the use of AI.
No technology is all good or all bad, but I see more downside with the use of AI in schools and society as a whole. One of the reasons for the explosion in the acceptance of conspiracy theories is that smart phones have absolved us of taking responsibility for our knowledge (I am speaking of society as a whole, not every individual). We used to have to work for knowledge and therefore felt a sense of responsibility for what we said. Now people can just regurgitate whatever appears on their screen and cite that as "proof". I think AI will make this problem worse.
Mike, you make a good point. I think smart phones are making us lazy, at the least, and probably stupider. And AI is the next step. When was the last time anyone had to memorize a phone number? I remember the days when I could recall a phone number or even the locker combination from my 7th grade locker (in the early 1970s). Now I don't even know most of my family members' phone numbers because I have an entire "phone book" in my hand/pocket.
Overuse of a "smart phone" leaves you stupid.
Where to start? Let's start with China. If there was ever a country that wanted to put hundreds of millions of people into the most menial task, no-brain-needed workforce, that would be China. What an opportunity to have more control over people starting with the babes. Let them learn...up to a point, then let them rely on AI to think.
I retired from 25 years in a NYC high school about two minutes after AI was introduced to the populace. My timing was coincidental, but it probably saved me from the frustration I would feel every day fighting another machine- and tech-driven push into education. Big tech has been trying for decades to figure out how to take that big pot of money that goes to educating our children and make it their money.
We already have a lazy populace and children learn to take the easy route. AI is not a tool for learning. It is very obviously a tool for unlearning, or dislearning or not learning.
Pope Leo (and Francis before him) has said that machine learning, while having great potential to help mankind, has even greater potential to rob mankind of its dignity, especially the poor and the working class.
How nice and warm the water is, the froggies think as they lazily kick around the pot.
A retired English prof, I have seen first-hand that technology that gives students an easy way out - free essays online, research through general search engines that allow cut-and-paste info, paper downloads - quickly becomes their go-to. Their acceptance of cheating increased; it was seen as wrong only in that they got caught, not because cheating itself is wrong. This was followed by an inability to discern fact from opinion or misinformation, to think critically, or to problem solve. Learning takes work and the exercise of the brain. I used to tell students to think of their brain as their thinking muscle: just like other muscles, it weakens and atrophies with disuse. This is a grave threat to a democracy that relies on an informed and critically thinking electorate. It's also a grave threat to future jobs as more and more becomes automated and less person-to-person interaction occurs.
My daughter, who goes to community college, uses AI. She is autistic and has trouble with structured learning. As you may or may not know, today's school systems teach neurodivergent kids in a manner that's akin to asking turtles to climb trees.
My daughter was raised in a new age, one that is in many ways inconceivable to me. I grew up in the 70s and 80s. When I played video games on my Coleco or Super Nintendo, I might spend weeks of frustration trying to figure out the next move in an adventure game like The Legend of Zelda. There were no resources where I could look up answers. You just hoped you'd stumble onto the answer, and maybe, sometimes, you just didn't. Ever. And gave up.
My daughter, when playing games in the late '00s and throughout the 2010s, simply went onto YouTube and watched a playthrough. Was she a cheater? Or was she resourceful? You could argue that she is less successful at problem solving than I was at her age, and you might be right. To her, though, it was the natural solution. The technology was there. She utilized it. I couldn't have prevented her from using it even if I'd wanted to, unless I locked her in a fairytale tower.
I can't keep her from using AI now. She is an adult of voting age. What I can do, and what her school has done, is encourage her to use it wisely: to never, ever copy and paste, and to recognize that doing so is an unacceptable shortcut and that it is cheating. So I ask her to summarize and paraphrase as often as possible. Is it then her organic voice? No, not really. But it will probably never be that again for any young person who knows that the possibility of using AI is real and that it exists within the click of a few buttons.
What am I going to do? Insist that we go back to a quill, an inkwell and parchment by candlelight? We are in a whole new world. We must stop our handwringing and adapt *ourselves* and our old ways of thinking. Parenting and teaching are a whole new frontier for us. Who are we to say that we know better, when it probably seems to our kids that we just don't know anything anymore? We no longer have the answers, but we also don't even have the right questions yet. Welcome to technological advancement in the, as my teenage son says, gargantuan 2025.
Education that no longer develops children's minds but only spoon-feeds them is not useful. As you already know, your daughter is a special case. What is helpful and useful for her, and should definitely be used in her case, would be destructive to other students if they were allowed to use it. Competent, well-managed educational systems are willing and able to make such allowances.
Yes, given how incurious and uncreative most Americans are, let’s find new ways to make us even more intellectually lazy.
I am a retired IT pro who still writes a lot of code. With AI's help, I can write a new system in way less time than I could before (hooray!), but it's often really crappy code that should be cleaned up (boo!). It's especially helpful when the system requirements involve an ancient or obscure technology, since it can do the heavy research instantaneously.
It’s making me lazy.
That said:
AI isn’t infallible (everything here is anecdotal)
1) One time it included random code from a previous system; when I pointed this out, it said something akin to: “You’re right! I was told to totally forget all previous work but I didn’t.”
2) AI code sometimes contains hidden bugs that are only found during rigorous testing (a small sketch of this is below). This takes time.
3) AI produces working code that I do not understand and/or that is very poorly written and should be studied and cleaned up. This takes time.
4) (I am not aware of this happening yet, but can easily imagine it): AI could be taught to include intentional security holes. Critical code should be thoroughly checked. This takes time.
Testing and cleanup are arguably the least fun part of programming, and are often the first to fall to the corporate budgetary axe as deadlines approach. Are we comfortable with that?
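To make point 2 concrete, here is a purely hypothetical sketch - not code from any actual system of mine; the function and the bug are invented for illustration - of the kind of edge case AI-generated code can quietly mishandle, and the pytest-style regression tests that eventually surface it:

# Hypothetical illustration only: an "AI-generated" helper with a hidden
# edge-case bug, plus the regression tests that surface it.

def average_order_value(orders):
    """Average value of a list of order dicts (the happy path works fine)."""
    total = sum(order["value"] for order in orders)
    return total / len(orders)  # hidden bug: ZeroDivisionError on an empty list

# These tests encode the behavior we actually want (empty list -> 0.0).
# Run against the function above, the first one fails, and that failure is
# how the hidden bug finally gets noticed. Writing and running them takes time.
def test_average_order_value_handles_empty_list():
    assert average_order_value([]) == 0.0

def test_average_order_value_normal_case():
    assert average_order_value([{"value": 10.0}, {"value": 30.0}]) == 20.0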
So it still does require human oversight, is what you're saying, but that isn't the fun part and it sucks?
Yes, it still needs human oversight. You still need to know exactly how to phrase the request. AI is like a 2-year-old (or maybe a teenager) that does EXACTLY what you say, except for the times that it doesn't. As a programmer, I personally find writing the code the best part; testing, fixing, and cleanup are more of a chore. I wouldn't say it sucks, but it is less fun.
Will AI eventually replace me? My guess is "not-completely": I will still be needed when something goes pear-shaped, not unlike rescuing the Roomba when it gets caught under the sofa.
How does AI-generated code affect regression testing? (I used to be an IT tech writer.)
If I were still tech writing, I think I would find AI helpful for explaining concepts and tech I had to learn to do my job. Sometimes the learning was like drinking from a fire hose and didn't lead to the deep understanding needed to write docs and manuals. I'd still be triple-checking, though.
Good question! While I use AI to write my code, I still manually write up testing checklists.
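For what it's worth, here is a small hypothetical sketch of what that can look like once checklist items get automated; the function and the checklist wording are invented for illustration, not taken from a real project:

# Hypothetical sketch: two manual checklist items rewritten as automated
# regression tests, so they can be re-run after every AI-assisted change.
import pytest

# Stand-in for an AI-generated function under test (name invented here).
def parse_config(cfg):
    if "version" not in cfg:
        raise ValueError("missing required field: version")
    return {"name": cfg.get("name", ""), "version": cfg["version"]}

# Checklist item: "loader rejects a config with no version field"
def test_parse_config_rejects_missing_version():
    with pytest.raises(ValueError):
        parse_config({"name": "demo"})

# Checklist item: "a valid config keeps exactly the expected keys"
def test_parse_config_keeps_expected_keys():
    parsed = parse_config({"name": "demo", "version": "1.2"})
    assert sorted(parsed) == ["name", "version"]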
Ah yes Make America Stupid Again
It's worth the time to read these studies. I just took a quick look.
That said, I can't imagine why there is support from schools to let students write using AI. Learning to write is fundamental to a liberal education. There are so many things in everyday life that require an understanding of how a subject is presented, why the information is meaningful and on and on. Knowing how to do something for yourself adds real value.
Other than profit, I don't get the push to apply AI to learning.
Your third sentence explains it perfectly.
Thanks so much for this thoughtful piece. I see two additional significant issues: students believe AI is always correct, and the industry/tech isn't sustainable. I thought this piece summed that problem up well, but I would love to know if others see the same issue. Are the same people running these AI companies the same people who caused the dot-com bust? https://www.wheresyoured.at/wheres-the-money/
Another possible direction increased use of AI may take is a reduction in brick-and-mortar schools and physical teachers. We have to ask: if an AI-based education system is implemented, at what point will teachers become unnecessary? The student will be guided by an AI assistant, with a physical teacher relegated to monitoring only. Will students even need to be in a physical classroom? The concern with AI is valid; our brains develop better when we have to build critical thinking skills without assistance. AI will erode that critical skill as students begin to depend more on AI and less on their own ability.
If you've ever actually taught in a school, you realize (as the COVID shutdowns showed) that at least half of the purpose of school is allowing students to learn to interact with a wide variety of different people in positive ways while building their own sense of self and self-confidence in those interactions, with a good part of the job of school staff being to put guardrails around all of that so that certain norms of behavior and interaction are kept in place.
Moving all education to individually siloed AI settings will lead us to a society which is not a society at all: one rendered increasingly dysfunctional because the need we humans have for interaction, especially in our adolescent years, will go unfulfilled, leading to increasing levels of depression and anxiety among our youngsters and nonexistent birth rates.
I agree and would opine that, with the advent of social media, your concerns already appear to be presenting themselves. Kids together in a group, all looking at their phones, is not my idea of healthy social interaction.
My kids both despise online learning. It's far less engaging than actual school. Have any of the article's authors or the commenters here actually spoken to a kid lately? This is all a bunch of adults lamenting how awful AI is and what the studies say about cognitive this and critical thinking that.
What do the students say? My son hates his online school's crappy AI learning platform. He can't wait to go back to a brick-and-mortar school with his friends.
Do you think the kids think they're lazy? I assure you, school is still hard AF. Much, much more so for neurodivergent kids, who are finally being recognized now that the diagnostic criteria for ADHD and autism have improved dramatically in recent years.
Why don't we ask students how they feel about it? I'll bet they aren't as impressed with AI, or as lazy and uninspired, as a lot of older people seem to think they are.
I would agree; in college, I really hated online classes. To me they were disruptive and inefficient, in that actual idea exchange was cumbersome and often limited by who could type faster. As an older student I couldn't keep up, nor could I decipher the language shortcuts familiar to young kids.
another way to "brainwash" America to make it easier for a complete takeover
Depriving generations of the joy of discovery through analysis and deep thought.
The tech companies are shoving AI on us. Meta AI asked the other day if I’d like AI to summarize my WhatsApp messages. Uh, no, I want to read the actual messages from friends and colleagues. No, CoPilot, I don’t want your help and I don’t want you to be the first thing I see, slowing access to the documents I need. My college freshman affirms that AI is used regularly for cheating. The NYT had an AI video quiz last week - very difficult to discern real from bs. Scary and stupid.
Just say "no." AI is as bad for your brain as the worst illicit drug, but far more subtle in its effect.
I'm surprised Trump is for this... after all, his parents had to pay money for all those essays he submitted. Why should his kids and grandkids get it for "free"?
Trump doesn't understand any of this, but his handlers are telling him that AI can be made to make Trump look like the most brilliant president ever, so he's all for it.
This has been my greatest concern since the beginning of AI discussions. I feared it would lower the human brain's ability to think, to learn, to reason, etc. It is HORRIBLE, and it's all about a fight over who can profit MOST from the development of their AI business/marketing.
And of course, the amount of energy needed to run AI is off the rails, so harmful and damaging. What can be done, though, aside from sharing wonderful articles and research like your own? I am SO, SO concerned about what it will do to young minds still finding their way to being literate, smart thinkers.