> At its heart, education is a project of guiding learners to exercise their own agency in the world. Through education, learners should be empowered to participate meaningfully in society, industry, and the planet.
I agree but I have never seen an education system that had this as a goal. It's also not what society or employers actually want. What is desired are drones / automatons that take orders and never push back. Teaching people about agency is the opposite of that.
We are so stuck in a 19th century factory mindset everywhere, GenAI is just making it even more obvious.
jjmarr · 1d ago
Employers want a high-agency leadership class and drones for the individual contributors.
There are systems that nurture agency and leadership. They are the private schools and the Ivy League universities. And many great companies.
Most people don't want to be leaders and be judged based on impact. They want to be judged based on effort. They followed all the rules when writing their essay and should get an A+ even though their essay is unconvincing. If they get a bad mark, their response is to create a petition instead of fixing the problems.
Maybe we should attack our culture of busywork and stop blaming educators for failing to nurture agency.
em-bee · 1d ago
the education i received in germany did have this goal. the teachers had this goal, and i have the impression that the teachers and schools my kids go to have this goal as well. i can't say how universal that is, but the opposite is not universal either.
the problem is that the goals are not effectively implemented. maybe it's more a dream than a goal, because the teachers and schools don't know how to actually reach that goal.
meaningful participation in society is often reduced to the ability to get a job by those outside of school, so you are right about employers. at least the large ones. unfortunately that works against them, because the current generation of juniors doesn't even want to learn anything. they are drones that just want to get paid, but are not motivated to learn what they need to do their job better.
x3qt · 1d ago
Just yesterday, I talked to a neighbor who has two kids attending a local school in Mitte. He told me that the children are constantly indoctrinated into group conformity, obedience to authority, and fear of "wrong-think," with a good splash of wokie-talkie on top of it. To me, that sounds like a complete erasure of agency. Schools must provide knowledge, not override the nurture given by parents.
I have personally observed how locals are bullied by overseas guests and choose a delusional escape into virtue signaling rather than defending themselves. I consider German upbringing to be that of a defeated people.
em-bee · 22h ago
I consider German upbringing to be that of a defeated people
i don't know what you are trying to imply here. how would the feeling of defeat affect the upbringing? (i mean, i am sure there would be an effect, but what would that look like?)
what i can tell you is that the sentiment i experienced was not defeat. after all, this is neither our experience, nor our parents' (and for the current generation not even their grandparents'). the feeling we were taught was that of embarrassment, of how could we let that happen, and consequently the need to understand how we can prevent that from ever happening again. except for a minority of right-wing sympathizers that we keep a close eye on.
x3qt · 21h ago
I think that the Allied victors laid the foundation of the current German education system on initial denazification and subsequent extreme pacification, to such a degree of impotence that people refuse to defend themselves even when they are fully capable of neutralizing a criminal, preferring to become victims rather than use force.
em-bee · 21h ago
i don't have this feeling at all. what experience do you base that on?
x3qt · 20h ago
I’ve seen multiple instances of robberies where the attacker was a head shorter and could have been easily stunned, or worse, with a single hit, yet people gave away their valuables because even the thought of using violence is taboo. Of course, the police always say, “Just file a complaint,” which never results in anything. It’s not a joke: even if violence is used purely to stop a criminal, the police will prosecute you, lol. I’m not American, but I like the idea that one could defend themselves and their property using all means necessary.
scarecrowbob · 1d ago
While you are likely correct about systems, I have known quite a few individual educators who have the goal of helping their fellow humans learn about their agency in the world.
TheNewsIsHere · 1d ago
I attended a public school system which, while it did falter at times in various ways, did a fairly good job meeting its stated mission, which was more or less exactly that.
I witnessed far more personal political pressure and cajoling than corporate or future-employer pressure. Where I went to school, the pressure on schools was usually from parents, students, and local groups concerned with civil matters. I had (until recently) indirect (and sometimes direct) exposure to this because one of my parents was an educator and a senior member of their department in a district adjoining the one I attended.
Where I went to college, it was always very clear to me what was shaped by industry vs. research and academia. I went to a research university for an uncommon hard-science degree and so there was a lot of employer interest, but the university cleverly drew a paywall around that and businesses had to pay the university to conduct research (or agree to profit sharing, patent licensing, etc). There was a clear, bright line separating corporate/employer interest from the classroom.
Neal Koblitz's "The Case Against Computers in Math Education".
EMIRELADERO · 1d ago
Wow. Now there's a quote:
> "Youngsters who are immersed in this popular culture are accustomed to large doses of passive, visual entertainment. They tend to develop a short attention span, and expect immediate gratification. They are usually ill equipped to study mathematics, because they lack patience, self-discipline, the ability to concentrate for long periods, and reading comprehension and writing skills."
For context, the essay is from 1996. You could have told me this is from the current year and I would have believed you.
ultrarunner · 1d ago
> You could have told me this is from the current year and I would have believed you.
Agreed. It's a matter of degree, and I wonder what reaching the eventual limit (if there is one) looks like.
bombcar · 1d ago
There’s a Platonic dialogue that expresses basically the same sentiment.
eikenberry · 22h ago
People see what they want to see, even very smart people.
ReDeiPirati · 1d ago
Ultimately these are tools, and I think the goal is to educate students to use them properly, especially since I don't expect the knowledge paradox to disappear anytime soon with these models.
atleastoptimal · 1d ago
The cat is out of the bag. Kids will use AI to write papers, learn topics, cheat on take-home tests, etc. Only a completely closed-off environment with no access to the internet could prevent this.
The best option is to change the incentives. 95% of kids treat school as a necessary hurdle to enter the gentry white-collar class. Either make the incentives personal enrichment instead of letter grades or continue to give students every incentive to use AI at every opportunity.
sillystu04 · 1d ago
> Only a completely closed-off environment with no access to the internet could prevent this.
Even then, an LLM running locally could still operate.
const_cast · 18h ago
> Only a completely closed-off environment with no access to the internet could prevent this.
Okay, then we should do this.
> Either make the incentives personal enrichment instead of letter grades
This just straight up does not work.
The incentive for not being obese is perhaps the most perfect incentive ever: you live a happier life, with a greater quality of life, for longer, with less societal friction. It's the perfect poster child of "personal enrichment".
And yet, obesity is not declining. How is this possible?
Because internal locus of control as a "solution" for systemic issues just does not work. It doesn't maybe work, it doesn't sometimes work, it never works. If you don't address institutional issues and physiological issues then you're never going to find a solution.
What I mean is, kids use AI because it's easy. It's human nature to take the path of least resistance. This has a physiological, a biological, component to it. If we're just going to be waiting around for the day people aren't lazy then we're all gonna die.
Schools are artificial environments by design. They're controlled environments by design. If we leave children to their own devices, they grow up stupid.
The problem is that education is a cumulative endeavor. We don't give calculators to kindergartners trying to learn the number line. Why not? Because if you don't have the neural connections to intuitively, and quickly, understand the number line, then Algebra is going to be a nightmare.
AI can enhance learning, if and only if the prerequisites are satisfied. If you use AI to write but you don't know how to write, then you're going to progress on and struggle much more than you should. We carefully and deliberately introduce tools to children. Here's your graphing calculator... in Algebra I, after you've already graphed on paper hundreds of times. You already understand graphing, great, now you're allowed to speed it up.
We, as adults, are very far removed from this. We have an attitude of "what's the problem?" because we already have those neural connections built. It's a sort of Lord Farquaad "some of you may die, but that's a risk I'm willing to take" approach, but we don't even realize we're doing it.
xnx · 1d ago
> GenAI is a threat to student learning and wellbeing.
This blanket dismissal is not going to age well, and reads like a profession lashing out.
With the right system prompt, AI can be a patient, understanding, encouraging, non-judgemental tutor that adapts and moves at the student's pace. Most students cannot afford that type of human tutor, but an AI one could be free or very affordable.
"How AI Could Save (Not Destroy) Education" (https://www.youtube.com/watch?v=hJP5GqnTrNo) from Sal Khan of Khan Academy
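As a concrete illustration of the "right system prompt" claim, here is a minimal sketch of how such a tutor might be set up for an OpenAI-style chat API. The prompt wording, function name, and example question are all illustrative assumptions, not anything from the letter or the comment:

```python
# Hypothetical tutor setup: a system prompt asking the model to behave
# patiently, plus a helper that builds the message list for one
# chat-completion request.
TUTOR_SYSTEM_PROMPT = (
    "You are a patient, encouraging, non-judgemental tutor. Adapt to the "
    "student's pace: if an answer is wrong, ask a simpler guiding question "
    "instead of giving away the solution."
)

def make_tutor_messages(student_question: str) -> list[dict]:
    """Build the messages for a single chat-completion request."""
    return [
        {"role": "system", "content": TUTOR_SYSTEM_PROMPT},
        {"role": "user", "content": student_question},
    ]

messages = make_tutor_messages("What are daughter cells in mitosis?")
```

The same `messages` list would then be handed to whichever chat client is in use; the tutor's "patience" lives entirely in the system prompt, which is exactly why it is cheap to replicate.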
> AI can be a patient, understanding, encouraging, non-judgemental tutor
Groan... no it can't. It can simulate all those things, but at the moment, "AI" can't actually be patient or understanding, judgemental or non-judgemental.
OK, it can be encouraging. "You're one good student, $STUDENT_NAME!" (1)
1) https://www.youtube.com/watch?v=jRPPdm09xZ8
Whether the AI is patient, understanding, etc. is entirely up to the person interacting with it to decide, just as they decide this when interacting with people. You can never know the internal state of the other party in a conversation, so it is up to you to model it; and if modeling it is best done with human metaphors, then use human metaphors.
abletonlive · 1d ago
I can say the exact same thing about you or anybody else. You can't be a patient, understanding, encouraging, non-judgmental tutor. You can only simulate it.
I really can’t understand why people don’t understand this. What am I missing?
binary132 · 1d ago
Philosophical zombies are supposed to be a thought experiment to demonstrate that solipsism and nihilism are stupid, not a rhetorical device to equate human minds to linear algebra statistical parrots.
netsharc · 1d ago
Geezus freaking christ.
Now is that a simulation of someone who thinks he's responding to a cretin... or actually the feelings of someone who thinks he's talking to a cretin?
avmich · 1d ago
Most students cannot afford the expertise necessary to make AI patient, etc.
I think the original phrase was made with the assumption "as it is right now".
I do share the concerns of the undersigned, even though I don't necessarily agree with all statements in the letter.
ktallett · 1d ago
If you are using the most commonly available AI and have an average ability to craft a search term, right now AI is not a particularly useful tool for learning anything. It is far too inaccurate to learn anything challenging. The key word here is could: yes, it is possible, but there is nothing yet to say we shall get there.
Loughla · 1d ago
My experience in higher education is that students use AI for one of two things:
1. To do the homework because they view classes and grades as a barrier to their future instead of preparation for such.
2. In place of a well crafted query in an academic database.
happytoexplain · 1d ago
It's not a blanket dismissal, it's a fact in context. It should read like a profession lashing out - that's what it is.
AI has enormous upsides and enormous downsides. The "you're going to look so dumb in the future" dismissal is lazy. Inevitability does not make something purely beneficial.
It's a fallacious line of thinking that's disappointingly common in tech-minded people (frequently seen in partnership with implications that Luddites were bad or stupid, quotes from historical criticisms of computers/calculators, and other immature usage of metaphor).
xnx · 1d ago
I'd respect the statement more if it acknowledged that AI had some benefit, or potential benefit in the future, but that they did not want to use it currently.
lawlessone · 1d ago
Maybe if we move from LLMS to real AI it will have benefits.
bshepard · 1d ago
"You have not discovered a potion for remembering, but for reminding; you provide your students with the appearance of wisdom, not with its reality. Your invention will enable them to hear many things without being properly taught, and they will imagine that they have come to know much while for the most part they will know nothing. And they will be difficult to get along with, since they will merely appear to be wise instead of really being so.” -- someone wise, or was he?
amelius · 1d ago
Makes sense. You also don't give calculators to students of arithmetic.
stephen_g · 1d ago
Sarcasm? We actually weren't allowed to take any kind of calculator into any of our advanced maths exams in University (and I'm talking just 15 years ago, not when they were newfangled things).
DavidPiper · 1d ago
(These days) it's hard to know what you mean by this and whether you're being sarcastic.
No you don't give arithmetic students calculators for their exams, and you expect them to know how to do it without one.
Yes, you probably give calculators to professionals who need to do arithmetic, so they can do it faster and with fewer errors.
Giving calculators to people who don't know how, why and/or when to use them will still get you bad results.
Giving calculators to someone who doesn't have any use for one is at best a waste of money and at worst a huge waste of time if the recipient becomes addicted to calculator games.
izacus · 1d ago
The person you're responding to has clearly used the word "student". What on earth are you on about?
DavidPiper · 1d ago
I interpreted "students of arithmetic" as anyone that practices arithmetic - similar to "students of medicine", etc.
mulmen · 1d ago
Seems like a reasonable expansion of the concept to me. Why the aggressive dismissal?
mulmen · 1d ago
Can’t tell if you are serious but I will assume you are.
Why not? Seems like a logical conclusion.
1. Introduce the concept.
2. Demonstrate an intuitive algorithm.
3. Assist students as they practice and internalize the algorithm.
4. Reinforce this learning by encouraging them to teach each other.
5. Show them how to use tools by repeating this process with the tool as the concept.
darth_avocado · 1d ago
You want to limit the use of AI in schools just the way you limit calculators: ensure the student can do the math without a calculator, even when the computation is hard, and then teach them to use the calculator as a tool that helps them move faster.
Restricting AI completely or introducing it too early, both would be harmful.
mulmen · 1d ago
I'm not really convinced. This sounds reasonable but I can't formulate a good argument in favor.
shminge · 14h ago
One of my favourite quotes on this topic:
> Using ChatGPT to write an essay is a bit like using a forklift to lift weights. The forklift might do a perfectly good job of moving around some heavy iron plates, but you’d be wasting your time.
The point of writing essays (or doing any other school assessment) is not the completed product, it's the work (and hopefully learning) that went into it.
You can definitely use AI responsibly, but many students will not and do not.
KevinMS · 1d ago
I'm not the biggest fan of AI for everything, but you couldn't create something that's more of a dagger to the heart of the current education system. If you are in the U.S., carefully watch for the D party to turn on AI in their messaging, and you'll witness the strong influence that teachers unions have on them. Disagree with me all you want, but keep your eyes open; I guarantee you'll see it soon.
sfpotter · 1d ago
Interesting thought but my impression is that the democrats are much more beholden to other forces at play in the school system. I have friends who are teachers in the public school system, have been active in the union, and are indeed against AI in the classroom (although they're hardly rabid or unreasonable about it). On the other hand, the school administrators and IT departments are much more aggressive about pushing AI on them and pressing them to work it into the classroom somehow. Considering that the democrats are largely captured by corporate interest, and considering that tech/AI is one of the biggest corporate interests there is right now... I just don't see things playing out the way you predict.
KevinMS · 1d ago
the administrators and IT departments are not in the teachers unions.
sfpotter · 1d ago
Yes... exactly my point.
KevinMS · 1d ago
and the teachers unions have vastly more power than those guys
sfpotter · 1d ago
Hey, I'm sorry, but such a blanket statement is pretty weak on its own. I'm interested in your perspective. Can you provide some concrete details that support your point? Because the people I know feel like AI in the classroom is inevitable and that they don't have much power in the face of the authority that wants to impose it on them, which would seem to contradict what you're saying.
thinkingtoilet · 1d ago
Every teacher I've talked to has said the influence of AI has been negative. Why wouldn't they fight to remove it from the classroom?
KevinMS · 1d ago
They are talking about cheating with it, not replacing teaching with it.
neurostimulant · 1d ago
Should we teach our kids to outsource their thinking to those GenAI services where the big clouds control the gate? It would be less of an issue if local GenAI with comparable capability were more accessible to the general public.
penguin_booze · 1d ago
Just in: new tariffs have been announced on educators. Not sure on whom, but there it is.
thedevilslawyer · 1d ago
> Further, GenAI adoption in industry is overwhelmingly aimed at automating and replacing human effort, often with the expectation that future “AGI” will render human intellectual and creative labor obsolete. This is a narrative we will not participate in
When every learner gets the high quality support and tutoring they need, all around the world, then we can talk about what you're unwilling to participate in. Until then, may every learner get a fantastic tutor via GenAI.
thedevilslawyer · 1d ago
Also,
>global community
As long as global means rich. 0 signatories from China, India, Russia, Pakistan, Bangladesh, Indonesia, Africa.
luqtas · 1d ago
do you think those have access to computers with AI for their education?
thedevilslawyer · 1d ago
Yes.
123yawaworht456 · 1d ago
Previously: An open letter from educators who refuse the call to adopt [printed books, ballpoint pens, calculators, computers, the internet] in education
shminge · 14h ago
There's a big difference between "Here's this tool that helps you think" (ie calculator or pen) and "Here's this tool that does the thinking for you". And before you say that AI can fall under the first option, plenty of schoolchildren will take the easy way out and not use it responsibly.
andy99 · 1d ago
> Current GenAI technologies represent unacceptable legal, ethical and environmental harms, including exploitative labour, piracy of countless creators' and artists' work, harmful biases, mass production of misinformation, and reversal of the global emissions reduction trajectory.
It's really annoying that political stuff always pollutes things. I largely agree with the position about GenAI being bad for education, but that position is not strengthened by tacking on a bunch of political drivel.
JKCalhoun · 1d ago
Whether you agree or disagree, I am happy to see a community putting out (in writing even) their problems with AI as it exists.
To the degree it is possible I would like to think the AI community would try to address their issues.
I understand that some of the items in their open letter show a complete incompatibility with AI, period. But misinformation, harmful biases, and energy and resource use should be things we all want to improve.
jacknews · 1d ago
I don't think resource use is any business of teachers to be honest.
The problem with AI currently is that the students have figured out how to use it to cheat, but the teachers haven't figured out how to use it to teach.
AI is here, we need to figure out how to use it effectively and responsibly. Schools should be leading on this, instead of putting their heads in the sand and hoping it goes away.
akomtu · 1d ago
AI is turning into a cult that's dividing us into those who support it and those who reject it. Arguments on both sides are flimsy, as no one really understands what it is. People see it as a black-box magic crystal.
nineplay · 1d ago
I find this all-or-nothing attitude extraordinary. Chatbots are the best personal tutors you'll ever find, and I tell students so. Do you need to understand mitosis for Bio 101? Ask your favorite chatbot. Then ask what daughter cells are, a question you might be too afraid to ask in class because maybe it was covered yesterday and you weren't listening. Then ask why there are no "son" cells, which you'd also be too afraid to ask about in class but want to know.
You can ask every dumb question. You can ask for clarification on every term you don't understand. You can go off on tangents. You can ask the same thing again ten minutes later because you forgot already.
No teacher or tutor or peer is going to answer you with the same patience and the same depth and the same lack of judgement.
Is it good enough for a grad student working on their thesis? Maybe not. Is it good enough for a high school student? Almost certainly. Does it give that high school student a way to _really_ understand biology, because they can keep asking questions until they start to understand the answers? I think absolutely.
doctorpangloss · 1d ago
There is no ethical generative AI, meaning fully permissioned datasets, end-to-end. It is not yet scientifically possible. So, 100%, everyone who claims this is lying, usually by omission, and some BS startup isn't going to invent this.
In my open letter, I wouldn't say "ethical" or "environmental" or any of these intersectional things because you're giving space for lies.
People want ethical AI even if it's impossible. So we get aspirationally ethical AI. Meaning, people really want to use generative AI, it makes life so easy, and people also want it to be ethical, because they don't want to make others upset, so they will buy into a memetic story that it is "ethical." Even if that story isn't true.
Aspirational ethics has already gotten hundreds of millions of dollars in funding. Look at generative AI in the media industry. Moonvalley - "FULLY LICENSED, COMMERCIAL SAFE" (https://www.moonvalley.com) - and yet, what content was their text encoder trained on? Not "fully licensed," no, not at all. Does everything else they make work without a text encoder? No. So... but people really want to believe in this. And it's led by DeepMind people! Adobe has the same problem. Some efforts are extremely well meaning. But everyone claiming expressly licensed / permissioned datasets is telling a lie by omission.
It's not possible to have only permissioned data. Anthropic and OpenAI concede that there's no technology without scraping. Listen: they're telling the truth.
ACCount36 · 1d ago
I loathe this entire line of "ethical" moral grandstanding.
AI should be trained on all data that is available. For a significant part of the dataset, it's the most useful that data has ever been.
sergiomattei · 1d ago
This is absurd. I've learned so much from having an LLM tutor me as I go through a dense book, for example.
I agree but I have never seen an education system that had this as a goal. It's also not what society or employers actually want. What is desired are drones / automatons that take orders and never push back. Teaching people about agency is the opposite of that.
We are so stuck in a 19th century factory mindset everywhere, GenAI is just making it even more obvious.
There are systems that nurture agency and leadership. They are the private schools and the Ivy League universities. And many great companies.
Most people don't want to be leaders and be judged based on impact. They want to be judged based on effort. They followed all the rules when writing their essay and should get an A+ even though their essay is unconvincing. If they get a bad mark, their response is to create a petition instead of fixing the problems.
Maybe we should attack our culture of busywork and stop blaming educators for failing to nurture agency.
the problem is that the goals are not effectively implemented. maybe it's more a dream than a goal, because the teachers and schools don't know how to actually reach that goal.
meaningful participation in society is often reduced to the ability to get a job by those outside of school, so you are right about employers. at least the large ones. unfortunately that works against them, because the current generation of juniors doesn't even want to learn anything. they are drones that just want to get paid, but are not motivated to learn what they need to do their job better.
I have personally observed how locals are bullied by overseas guests and choose a delusional escape into virtue signaling rather than defending themselves. I consider German upbringing to be that of a defeated people.
i don't know what you are trying to imply here. how should the feeling of defeat affect the upbringing? (i mean,i am sure there would be an effect, but how would that look like?)
what i can tell you is that the sentiment i experienced was not defeat. after all this is neither our, nor our parents, (and for the current generation also not their grandparents) experience. the feeling we were taught was that of embarrassment, of how could we let that happen and consequently the need to understand how we can avoid that from ever happening again. except for a minority or right wing sympathizers that we keep a close eye on.
I witnessed far more personal political pressure and cajoling than corporate/future employer. Where I went to school the pressure on schools was usually from parents, students, and local groups concerned with civil matters. I had (until recently) indirect (and sometimes direct) exposure to this because one of parents was an educator and a senior member of their department in an adjoining district to the one I attended.
Where I went to college, it was always very clear to me what was shaped by industry vs. research and academia. I went to a research university for an uncommon hard-science degree and so there was a lot of employer interest, but the university cleverly drew a paywall around that and businesses had to pay the university to conduct research (or agree to profit sharing, patent licensing, etc). There was a clear, bright line separating corporate/employer interest from the classroom.
Neal Koblitz's "The Case Against Computers in Math Education".
> "Youngsters who are immersed in this popular culture are accustomed to large doses of passive, visual entertainment. They tend to develop a short attention span, and expect immediate gratification. They are usually ill equipped to study mathematics, because they lack patience, self-discipline, the ability to concentrate for long periods, and reading comprehension and writing skills."
For context, the essay is from 1996. You could have told me this is from the current year and I would have believed you.
Agreed. It's a matter of degree, and I wonder what reaching the eventual limit (if there is one) looks like.
The best option is to change the incentives. 95% of kids treat school as a necessary hurdle to enter the gentry white-collar class. Either make the incentives personal enrichment instead of letter grades or continue to give students every incentive to use AI at every opportunity.
Even then, an LLM running locally could still operate.
Okay, then we should do this.
> Either make the incentives personal enrichment instead of letter grades
This just straight up does not work.
The incentive for not being obese is perhaps the most perfect incentive ever: you live a happier life, with a greater quality of life, for longer, with less societal friction. It's the perfect poster child of "personal enrichment".
And yet, obesity is not declining. How is this possible?
Because internal locus of control as a "solution" for systemic issues just does not work. It doesn't maybe work, it doesn't sometimes work, it never works. If you don't address institutional issues and physiological issues then you're never going to find a solution.
What I mean is, kids use AI because it's easy. It's human nature to take the path of least resistance. This has a physiological, a biological, component to it. If we're just going to be waiting around for the day people aren't lazy then we're all gonna die.
Schools are artificial environments by design. They're controlled environments by design. If we leave children to their own devices, they grow up stupid.
The problem is that education is a cumulative endeavor. We don't give calculators to kindergartners trying to learn the number line. Why not? Because if you don't have the neural connections to intuitively, and quickly, understand the number line, then Algebra is going to be a nightmare.
AI can enhance learning, if and only if the prerequisites are satisfied. If you use AI to write but you don't know how to write, then you're going to progress on and struggle much more than you should. We carefully and deliberately introduce tools to children. Here's your graphing calculator... in Algebra I, after you've already graphed on paper hundreds of times. You already understand graphing, great, now you're allowed to speed it up.
We, as adults, are very far removed from this. We have an attitude of "what's the problem" because we already have built those neural connections. It's a sort of Lord Farquad "some of you may die, but that's a risk I'm willing to take" approach, but we don't even realize we do it.
This blanket dismissal is not going to age well, and reads like a profession lashing out.
With the right system prompt, AI can be a patient, understanding, encouraging, non-judgemental tutor that adapts and moves at the student's pace. Most students can not afford that type of human tutor, but an AI one could be free or very affordable.
"How AI Could Save (Not Destroy) Education" (https://www.youtube.com/watch?v=hJP5GqnTrNo) from Sal Khan of Khan Academy
Groan... no it can't. It can simulate all those things, but at the moment, "AI" can't be patient, understanding, and whether judgemental or non-judgemental.
OK it can be encouraging. "You're one good student, $STUDENT_NAME!" (1).
1) https://www.youtube.com/watch?v=jRPPdm09xZ8
I really can’t understand why people don’t understand this. What am I missing?
Now is that a simulation of someone who thinks he's responding to a cretin... or actually the feelings of someone who thinks he's talking to a cretin?
I think the original phrase was made with the assumption "as it is right now".
I do share the concerns of the undersigned, even though I don't necessarily agree with all the statements in the letter.
1. To do the homework because they view classes and grades as a barrier to their future instead of preparation for such.
2. In place of a well crafted query in an academic database.
AI has enormous upsides and enormous downsides. The "you're going to look so dumb in the future" dismissal is lazy. Inevitability does not make something purely beneficial.
It's a fallacious line of thinking that's disappointingly common in tech-minded people (frequently seen in partnership with implications that Luddites were bad or stupid, quotes from historical criticisms of computers/calculators, and other immature usage of metaphor).
No, you don't give arithmetic students calculators for their exams; you expect them to know how to do it without one.
Yes, you probably give professionals who need to do arithmetic calculators so they can work faster and with fewer errors.
Giving calculators to people who don't know how, why and/or when to use them will still get you bad results.
Giving calculators to someone who doesn't have any use for one is at best a waste of money and at worst a huge waste of time if the recipient becomes addicted to calculator games.
Why not? Seems like a logical conclusion.
1. Introduce the concept.
2. Demonstrate an intuitive algorithm.
3. Assist students as they practice and internalize the algorithm.
4. Reinforce this learning by encouraging them to teach each other.
5. Show them how to use tools by repeating this process with the tool as the concept.
Restricting AI completely and introducing it too early would both be harmful.
> Using ChatGPT to write an essay is a bit like using a forklift to lift weights. The forklift might do a perfectly good job of moving around some heavy iron plates, but you’d be wasting your time.
The point of writing essays (or doing any other school assessment) is not the completed product, it's the work (and hopefully learning) that went into it.
You can definitely use AI responsibly, but many students will not and do not.
When every learner gets the high quality support and tutoring they need, all around the world, then we can talk about what you're unwilling to participate in. Until then, may every learner get a fantastic tutor via GenAI.
>global community
As long as global means rich. 0 signatories from China, India, Russia, Pakistan, Bangladesh, Indonesia, Africa.
It's really annoying that political stuff always pollutes things. I largely agree with the position about GenAI being bad for education, but that position is not strengthened by tacking on a bunch of political drivel.
To the degree it is possible I would like to think the AI community would try to address their issues.
I understand that some of the items in their open letter show a complete incompatibility with AI — period. But misinformation, harmful biases, energy resource use should be things we all want to improve.
The problem with AI currently is that the students have figured out how to use it to cheat, but the teachers haven't figured out how to use it to teach.
AI is here, we need to figure out how to use it effectively and responsibly. Schools should be leading on this, instead of putting their heads in the sand and hoping it goes away.
You can ask every dumb question. You can ask for clarification on every term you don't understand. You can go off on tangents. You can ask the same thing again ten minutes later because you forgot already.
No teacher or tutor or peer is going to answer you with the same patience and the same depth and the same lack of judgement.
Is it good enough for a grad student working on their thesis? Maybe not. Is it good enough for a high school student? Almost certainly. Does it give this high school student a way to _really_ understand biology, because they can keep asking questions until they start to understand the answers? I think absolutely.
In my open letter, I wouldn't say "ethical" or "environmental" or any of these intersectional things because you're giving space for lies.
People want ethical AI even if it's impossible. So we get aspirationally ethical AI. Meaning, people really want to use generative AI, it makes life so easy, and people also want it to be ethical, because they don't want to make others upset, so they will buy into a memetic story that it is "ethical." Even if that story isn't true.
Aspirational ethics has already attracted hundreds of millions of dollars in funding. Like look at generative AI in the media industry. Moonvalley - "FULLY LICENSED, COMMERCIAL SAFE" (https://www.moonvalley.com) - and yet, what content was their text encoder trained on? Not "fully licensed," no not at all. Does everything else they make work without a text encoder? No. So... But people really want to believe in this. And it's led by DeepMind people! Adobe has the same problem. Some efforts are extremely well meaning. But everyone claiming expressly licensed / permissioned datasets is telling a lie by omission.
It's not possible to have only permissioned data. Anthropic and OpenAI concede it: there's no technology without scraping. Listen, they're telling the truth.
AI should be trained on all data that is available. For a significant part of the dataset, it's the most useful that data has ever been.