> I should hope that the purpose of a class writing exercise is not to create an artifact of text but force the student to think; a language model produces the former, not the latter.
It's been incredibly blackpilling seeing how many intelligent professionals and academics don't understand this, especially in education and academia.
They see work as the mere production of output, without ever thinking about how that work builds knowledge and skills and experience.
Students, who know least of all and don't understand the purpose of writing or problem solving, or the limitations of LLMs, are currently wasting years of their lives letting LLMs pull them along as they cheat themselves out of an education. Some spend hundreds of thousands of dollars letting their brains atrophy, only to get a piece of paper and face a real world where problems become massively more open-ended and LLMs fall far short of the required quality of problem solving.
Anyone who actually struggles through problems and learns for themselves is going to have a massive advantage in the long term.
fallinditch · 54m ago
It's been obvious since ChatGPT blew up in early 2023 that educators had to rethink how they educate.
I agree that the situation the author outlines is unsatisfactory, but it's mostly the fault of the education system (and by extension the post author). With a class writing exercise like the one the author describes, of course the students are going to use an LLM; they would be stupid not to if their classmates are using it.
The onus should be on educators to reframe how they teach and how they test. It's strange that the author can't see this.
Universities and schools must change how they do things with respect to AI, otherwise they are failing the students. I am aware that AI has many potential and actual problems for society, but AI, if embraced correctly, also has the potential to transform the educational experience in positive ways.
EvgeniyZh · 3m ago
> they would be stupid not to if their classmates are using it.
Why would they be stupid? Were people before LLMs stupid for not asking a smarter classmate, a parent, or a paid contractor to solve their homework for them?
A large part of education is learning things that can be easily automated, because you can't learn hard things without first learning easy things. Nothing has conceptually changed in this regard; Wolfram Alpha, for example, didn't change the way differentiation is taught.
easygenes · 38m ago
Amusingly, when I asked o3 to propose changes to the education system that address the author's complaints about writing assignments, one of the first things it suggested was transparent prompt logging (basically what the author proposes).
https://chatgpt.com/share/6817fe76-973c-8011-acf3-ef3138c144...
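For what it's worth, the prompt-logging idea is easy to sketch. A minimal, hypothetical version (the log file name and the stand-in model call below are my own assumptions, not anything o3 or the author specified) just appends every exchange to a local JSONL file that gets handed in alongside the essay:

    # Hypothetical sketch of transparent prompt logging: every prompt a
    # student sends is appended to a local JSONL file that can be
    # submitted alongside the finished essay.
    import json
    import time
    from pathlib import Path

    LOG_PATH = Path("prompt_log.jsonl")  # assumed log location

    def log_exchange(prompt: str, response: str) -> None:
        """Append one prompt/response exchange to the submission log."""
        entry = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "prompt": prompt,
            "response": response,
        }
        with LOG_PATH.open("a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")

    def ask_model(prompt: str) -> str:
        """Stand-in for whatever LLM call the class permits."""
        response = "(model output goes here)"  # replace with a real API call
        log_exchange(prompt, response)
        return response

The hard part obviously isn't the logging; it's getting students to route their usage through something like this honestly.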
> Universities and schools must change how they do things with respect to AI, otherwise they are failing the students.
Hard disagree.
Students need to answer a fundamental question for themselves: are they here for the education, or for the piece of paper?
If it is the former, the latter doesn't really matter. If it is the latter, the former was not the point to begin with.