Conflating Answers with Learning: Using AI to Learn


"A little learning is a dang'rous thing;

Drink deep or taste not the Pierian spring:

There shallow draughts intoxicate the brain,

And drinking largely sobers us again.

- from "An Essay on Criticism, Part II", Alexander Pope


Those who know me, or have read my profile online, know that I teach teams how to develop software using iterative tools and techniques. These require rewiring some of the traditional, waterfall-style thinking and approaches, and yes, they can be a culture shock. To warn my trainees, I always go through a specific explanation with them. It goes something like this:

Many folks expect learning to be a straight line. You do a training course, or read a book on a subject, then you use it.

Unfortunately this is the South Park Underpants Gnomes model of learning. It leaves out the important part.


Learning does not work in a straight line. It's messy. It's unpredictable. You will make mistakes. Go down blind intellectual alleys. Get confused. Make amazing leaps forward. You will struggle at times and be tempted to give up (and trust me, it is very tempting to just give up and go back to the old way of doing things!). But these pathways you build by getting things wrong and working things through are what creates understanding. They are an important part of the learning process. They are the learning process. They provide the insights needed to truly understand a subject.

This process is summarised in this most excellent graphic from Nick Sousanis.


Understanding the nature of learning shows up just one area where the current spate of overhyped AI engines is damaging us, insidiously dumbing us all down. One thing the AI companies would have us believe their modern-day Mechanical Turks can do is absorb and assimilate information, critically assessing it and presenting it to us on demand. Setting aside the fact that this is nothing more than a marketing lie - AI engines simply look for the most likely string of words to return in answer to a prompt; they are stochastic parrots, text extrusion machines that definitely do not "think", let alone critically - simply presenting the most likely answer to a question takes away the learning process. It undermines deep understanding of subjects. It conflates answers with learning.

So - is it possible to learn a subject using AI? I say not properly, and especially not if you are a raw beginner lacking basic concepts. It's like sitting an exam where you have been given the answers. Factual learning by rote has its place, but for complex subjects you will be exactly the sum of your learning experience - in this case a parrot that can only repeat the noises it heard, unable to break out of its cage. For real understanding, you need to use Real Intelligence. Real Human Intelligence.
