Students are outsourcing thought to ChatGPT — here's why educators should worry, a business professor warns (www.businessinsider.com)

🤖 AI Summary
Northumbria University business professor Kimberley Hardcastle warns that students are increasingly outsourcing core learning tasks to generative AI like ChatGPT and Anthropic's Claude — not merely plagiarizing but delegating the thinking itself. Anthropic analyzed roughly one million anonymized conversations over 18 days and, after filtering to 574,740 education-linked chats from verified university email accounts, found 39.3% involved creating or polishing study materials (essay drafts, summaries, practice questions) and 33.5% asked the model to solve assignments directly.

Hardcastle argues this lets learners produce polished outputs without the "cognitive journey" that builds critical thinking, and that students may start accepting ideas because an AI articulates them convincingly rather than because they've independently evaluated the evidence. The concern is both pedagogical and systemic: generative models produce authoritative-sounding answers from opaque, black-box training data, concentrating influence over knowledge pipelines in a handful of tech firms whose biases, design choices, and commercial incentives shape what students learn.

Hardcastle says universities' current focus on plagiarism detection and AI literacy is insufficient; institutions must actively redesign pedagogy and governance so that educators — not commercial platforms — steer how AI is used in learning. Without deliberate action (e.g., responsible-AI centers and curriculum changes), she warns, generative AI risks reshaping not just assessment but the very nature of knowledge and reasoning in education.