AI in Education May Require Us to Return to 15th-Century Pedagogy

An exploration of what we can learn from medieval Oxford about teaching in the age of AI

June 12, 2024, 7:22 p.m.

The year is 1450. You are among the roughly 10% of men in Britain who are literate (to say nothing of women). You belong to the upper class – clergy, dukes and other nobility – or perhaps you are an early merchant.

You enroll at Oxford. After a year of study, you visit home. There, I ask you two questions:

  1. What are you learning?
  2. Why are you learning?

The ‘Why’ and ‘What’ of Pre-Industrial Education

The modern philosophy of education has been dominant – with only marginal changes – for over a century (more on that later). But in the 15th century, the what and why of education were radically different.

In our studies of history we often focus on the religious aspect of medieval and pre-industrial education. And while this is a valid observation, in some ways it misses the point for this article. The specific doctrinal importance of Catholicism or Protestantism as taught at Oxford is less important than understanding the goal of this teaching.

In the earliest days of Oxford, scholars had begun to translate and research Greek philosophy, and much of the goal of this early scholarship was to reconcile Greek and Christian theology, and integrate the seeds of scientific reasoning into European culture without upending its primary moral values.

It is easy to assume that theology played a central role in early higher education as a way of indoctrinating the upper class in Christian doctrine. However, European history had already proven, and would continue to prove, that it is much easier to indoctrinate and control the uneducated.

Theology was taught because it was the prevailing epistemology of the time. Understanding it, along with literacy and basic mathematics (so far as they existed), constituted the primary cognitive skills the upper class needed to administer the government and rule of the country; the vast majority of Oxford's earliest graduates worked directly for the monarchs.

In other words, the primary goal of early education was to enable sound thinking, not aggregation of facts, because there just weren’t that many facts to be known.

What did you learn? Sound thinking in the form of the prevailing epistemology of the time.

Why did you learn? To serve as a thought leader within society.

The Proliferation and Specialization of Knowledge

What follows is a grossly oversimplified account of 500 years of educational history. Over the 16th and 17th centuries, increases in literacy, reasoning, and scientific thought swept Europe via both the Reformation and the Enlightenment.

From the 1600s to the present, the amount of information in the world exploded. We take for granted that any one person can only know a fraction of the scientific knowledge in the world. From a historical perspective, however, this is a new phenomenon.

This proliferation of knowledge fundamentally changed what was needed in education. Education shifted from a focus on thinking to a focus on specialized facts.

Many educational researchers and philosophers have pushed back on this over the centuries, but especially with the rise of nationalized, standardized education, the focus has become on educating a society with the necessary concrete skills needed to succeed in a specialized job role. Beyond basic mathematical and literacy education, this meant acquiring specialized knowledge.

Between 1960 and 2010, the percentage of US adults with an undergraduate degree rose from 7% to almost 40%, and high school graduation rates rose from 40% to over 90%.

The need to acquire specialized knowledge had peaked.

Commoditized Knowledge

Knowledge is in the process of being “commoditized.” The marginal value of any fact is nearing zero. And we cannot change this.

In the background, however, things were beginning to change. First came the internet. CliffsNotes broke English teachers' tests; answers to textbook math problems began to show up online. Companies like Chegg made millions of dollars on things like 'note sharing' (or shall we just call it cheating?). Smartphones then put this information in students' pockets, making cheating easier still.

Then ChatGPT blew the whole thing up. The friction of "looking up" information (though that term isn't even fully accurate anymore) fell to zero. Instead of combing through search results and assessing their validity, the information is brought directly to you.

Teaching for Thinking – 500 Years Later

In the 1600s, there were very few facts. In the 2020s, there are millions of facts, but all but the most specialized are available in a moment. In many ways, the cognitive processes needed to succeed have come full circle, with critical thinking once again taking center stage.

So – how does that change the way we teach?

There are many related ideas floating around, many of them centered on assessment, that aim to remove AI from education and thereby restore the importance of facts: a return to paper-and-pencil tests and in-class timed essays, for example.

I don't have anything against these ideas per se, but they are largely an extension of the fight we've been having since the introduction of the calculator: an arms race against technology.

What if instead of continuing this fight, we gave in and allowed the commoditization of knowledge? What would that look like?

Well, it would likely look much like 15th-century Oxford: a time before stringent assessment, when the focus was on writing, conversing, and thinking. A radical version of this might see public education transition toward the tutorial system still used at Oxford today, in which students pick up and reference the knowledge needed for learning before meeting in small groups with their teachers for academically rigorous conversation.

This would require us to radically rethink how we grade and assess. We may be able to put off this adjustment for another decade or two, but with GPT-4 passing the bar exam, the life of the bubble sheet seems to be nearing its end.

There are less radical options that we can begin to employ now without overhauling our entire education system. The 'flipped classroom' is a simple modification in which students watch videos (or, perhaps, use an AI) to get the basics of a concept down, then participate in hands-on projects in class.

Slightly further afield, but still very achievable, are problem-based and project-based learning. Both engage students in larger, more open-ended activities that can take days or weeks, with project-based learning focusing on longer activities that lead to a concrete, real-world deliverable.

The Role of the Teacher

However this is implemented, the role of the teacher is changing drastically. The value of helping students learn facts has been in retreat for a century, and it is about to collapse completely. Even middle school and high school teachers will be asked to become more professorial: helping students learn to think critically and igniting their passions.

This won't be a small divergence from the educational system of the 20th century. In some ways, it will be a return to the 15th-century way of teaching.

Chatterpulse AI is an AI tool that focuses on helping teachers become more like those 15th century professors. If you'd like to learn more, check us out at

Follow us for more!