
AI And The Critical Thinking Crisis

And how Isaac Asimov nailed the problem 67 years ago.

Michael Long
5 min read · Dec 30, 2024
Illustration by DALL-E, of course.

I’ve been reading a lot about allowing the use of AI-based tools in education in general, and in computer programming classes in particular.

Read any such conversation online, and you’ll no doubt run into a lot of people saying that students should be allowed to use AI-based tools like ChatGPT in class and even during tests.

The logic is that students will have access to those tools once they’re working, and that if they haven’t been trained to use them, they’ll be at a disadvantage when competing against those who have.

But there are a few holes in that logic.

AI tools like ChatGPT are undeniably impressive. They generate answers, provide explanations, and even write code that works.

At least, most of the time.

But the problem with “most of the time” is simple: what happens when it doesn’t?

When the AI gets it wrong, do those students or future developers have the experience or skill needed to spot the error? Or are they blindly trusting an answer that, to them, looks convincing but could be fundamentally flawed?
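To make that concern concrete, here’s a hypothetical sketch (my own illustration, not something from a real AI session) of the kind of code an assistant might hand back: it compiles, reads cleanly, and passes a casual test, yet hides flaws that an inexperienced reader could easily miss.

```swift
// A hypothetical example of AI-generated-looking code: clean, plausible,
// and correct for most inputs.
func median(of values: [Double]) -> Double {
    let sorted = values.sorted()
    let mid = sorted.count / 2
    // Subtle flaws: this crashes on an empty array, and for an even count it
    // returns the upper-middle element instead of averaging the two middle
    // values, which is what a median actually requires.
    return sorted[mid]
}

print(median(of: [3, 1, 2]))      // 2.0 (correct)
print(median(of: [1, 2, 3, 4]))   // 3.0 (should be 2.5)
// median(of: [])                 // runtime crash: index out of range
```

A developer who has never written and debugged this kind of code by hand has little chance of spotting those edge cases, and that’s exactly the skill that leaning on the tool never builds.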

Especially when ChatGPT asserts that the answer is correct, and, as we’ve all no doubt…



Written by Michael Long

I write about Apple, Swift, and SwiftUI in particular, and technology in general. I'm also a Lead iOS Engineer at InRhythm, a modern digital consulting firm.
