[Opinion] Critical thinking and AI
I was asked why I don't believe we teach enough critical thinking to manage AI.
Back in 2022/2023, I put together a team of academics, teachers and members of the education community, and we mapped out education in Western systems from multiple perspectives. None of the maps made sense until we discovered the anchors were wrong.
The anchors help define the purpose of the map, and during our discussions we realised the purpose of education was not student opportunity but economic activity. The real purpose was supporting the educational establishment and the Government's desire to produce useful economic units that support the market.
Once we changed the anchors, which took some real soul searching and difficult conversations among all those involved, the maps clicked into place. Remember, these were people who taught children and young adults; this change of purpose was anything but comfortable for them and prompted several unpleasant "ah-ha" moments.
We then used the maps to determine where to invest for societal and market benefit, aggregating the results across all the maps. This created the attached table, which I published in 2023. From a societal benefit point of view, we needed to invest in personal education plans (and lifelong learning) along with critical thinking. From a market benefit perspective, we needed to invest in new practices with AI and digital access.
We have mostly done the latter but not the former. We are preparing a population of useful economic units that can use and follow the outputs of large language models, rather than people who can question and challenge those outputs.
Given that these large language models are coherence engines rather than truth engines, we are driving towards a civilisation that is fluent in everything except truth: a population with more information than at any point in history, and simultaneously less capacity to judge any of it.
This is not the fault of teachers; it is a political choice, mostly focused on the pursuit (or, more accurately, the hope) of short-term growth. If we train minds for coherence and for compliance, then we shouldn't be surprised when they obey the machine. And a society that forgets how to challenge, how to doubt, soon forgets how to think.
All of this harm, which is already being reflected in concerns raised by 13-18 year olds, is being compounded by deliberate messages of disruption that create a general fear among many students for their futures. Let us be blunt: venture capitalists in AI, by promoting narratives of mass job replacement, are contributing to a disturbance of social harmony through rhetorical manipulation for reasons of economic self-interest.
We have no laws or protections for society against this. The UK is also rushing to surrender future sovereignty on the promise of growth by becoming a node of US AI infrastructure. We will not only forget how to think; we will surrender our capability to do so.
Originally published on LinkedIn.
