Yesterday, I was part of a recording session for the Fueling Creativity in Education podcast, co-hosted and produced by Dr. Matthew Worwood and Dr. Cyndi Burnett. (Special thanks to both of them for having me as a guest, and please consider a subscription!) It was an invigorating conversation, and a topical one, as valuing human creativity in a world of artificial intelligence seems to be something that has a particular urgency in the here and now. Let's examine AI and creativity separately first.
*Dr. Cyndi Burnett, right before the recording began. I was part of the photo collage of guests, celebrating the podcast's recent fifth anniversary!*
When we examine a snapshot of artificial intelligence today, it seems necessary to pull out the quote marks. We may take some comfort that for all of AI's remarkable speed and agility to "write" or "draw" or "ideate," it is essentially a highly efficient pattern-maker, building its programmatic skill on the backs of human ingenuity, and a fair share of its generated images and videos still fall into the AI-slop category of questionable quality. (Whether those platforms fairly compensated, or even appropriately cited, the human ingenuity that trained them, or whether AI is worth its environmental impact, is a whole other ball of wax I'll set aside for the moment.)

However, people have debated for thousands of years which human-made products are worthy of the label "art," and here begins a deluge of questions. Are we as humans not also products of our times, influenced (subconsciously or not) by all that came before us? How do we define the purity of a so-called "original thought"? Can a first-grader be called a writer, even if their handwriting is uneven and their thoughts are fairly trite? Does a text qualify as a literary novel if it is self-published and full of typos and cliches, but has a million downloads? When do the air quotes end, and when does some credentialed institution deign to bestow the scepter of true writer, writing, art, or artist? If Chaucer is the yardstick by which we should judge Western literature, how do we evaluate the writing output of a human child, or a popular hack, or a machine in comparison? Of course, wherever you may land on the spectrum of philosophical debate on the ethics of artificial intelligence (and to be clear, I have concerns about AI on several levels even as I use it and see its potential strengths), we all must agree that AI today is the worst it will ever be, performance-wise.
We have to remind ourselves that ChatGPT -- or at least the version 3.5 that was born on November 30, 2022 -- is not even four years old. Even if you think all it does is connect pre-existing dots and wave a wand dispensing ephemeral digital parlor tricks, ChatGPT (and its various LLM siblings) is already pretty impressive for an immature digital child-bot not old enough to be in kindergarten yet. (For the record, my anthropomorphism here -- like the other juxtapositions of human and machine above -- is meant to provoke thoughtful debate, debate that will only grow more heated as lines blur and AI capability grows. With that said, in a world where people talk with machines that are often sycophantic and merely mimic human feelings, we need to be repeatedly reminded that artificial intelligence is not a person. It's refreshing to hear a company like SchoolAI decide that their chatbot Dot will no longer have a cute face, because "[s]tudents need to know when they're talking with AI.")
We can now move on to another question that requires air quotes: what do we mean by "creativity"? In the world of education, we sometimes suffer for want of a common nomenclature in defining creativity, in the same way we struggle to agree on definitions of "engagement" or "collaboration." There is often a shoulder-shrugging, you-know-it-when-you-see-it level of understanding, and that applies to most of us when we try to define something as squishy as creativity. Perhaps more problematic is the onerous task of recognizing creativity in students, or, even more difficult, cultivating it. A beleaguered teacher might ask, What gives me the right to judge or teach others about creativity, when I may not consider myself creative in the first place? To put it mildly, it's a big challenge. And yet, I believe most of us agree that if public education is to endure and thrive, it must do a better job of nourishing in its learners the human-centered qualities we value, such as joy, relationships, and, yes, creativity.
The scholarship behind creativity, particularly in education, is vast, and I would not pretend I could encompass or summarize all of that brilliance in a single blog entry. But luckily, we have a good start with my recent podcast hosts! Dr. Burnett is the director of Creativity and Education, which offers a "Five-Point Star model" for pragmatically bringing creativity into your classroom: Understand Creativity, Recognize Your Own Creativity, Support a Creative Environment, Bring Creativity Into Your Lessons, and Teach Creativity as Its Own Skill. Dr. Worwood, along with Dr. James C. Kaufman, designed the CAUSE Model of Creative Languages (Connect, Apply, Understand, Share, and Express), and they "consider how an individual’s varying levels across these five Creativity Languages (innate, proficient, independent, basic, or dormant) may influence creative behavior, choice of domains to pursue, and potential eventual success." Until their article, I hadn't heard of the Four C Model of Creativity (designed by Dr. Kaufman and Dr. Ronald Beghetto). What I found useful in the Four C Model was the idea that creativity doesn't always have to be equated with genius, an equation that makes being creative feel elitist and impossible to reach or teach. Instead, Kaufman and Beghetto describe a continuum of creativity:
- "mini-c": anything "new and meaningful," although perhaps limited to only personal value
- "little-c": "[w]ith appropriate feedback, advancements are made and what was created might be of value to others"
- "Pro-c": "the ability to be creative at a professional level and in a professional venue"
- "Big-C": something that "will be remembered in the history books"
As an educator, that already gives me relief; not every act of creation must have a "Big-C" level of impact. At minimum, we are all capable of "mini-c" and "little-c" moments and can recognize and encourage them in others.
But how do we foster creativity in a world where AI threatens to simplify (or, perhaps more aptly, sloppify) inventive human thinking? The theme of the April 2026 issue of Educational Leadership is "Igniting Curiosity in Schools," invoking another "c" word strongly related to creativity. In the article "Sparking Curiosity with Applied Intelligence" by Elizabeth Agro Radday and Matt Mervis, the title draws a distinction between students passively using AI and actively applying it to solve authentic questions and problems: "When students use AI, they rely on it to provide an answer, often bypassing productive cognitive struggle. When students apply AI, they expand their curiosity and creativity and become creators. In these cases, AI is part of the solution to a larger, messier problem that cannot be solved or answered with a simple prompt." While they point to a study in which college graduates are struggling to find entry-level positions thanks to AI phasing out such jobs, they also rightfully put some of the blame on traditional school systems. "The automation of low-level tasks," warn Agro Radday and Mervis, "the very 'clerk work' we often assign in schools, is already having a disproportionate impact on young people entering the workforce." We need instructional environments that encourage students to be creative, not compliant cogs mindlessly completing transactional tasks. If we keep treating students like machines, they will be replaced by them.
Naturally, this connects to late 19th century utopian novels and William Morris. But hear me out, I can explain.
Back in 2001, I wrote a "hyperessay" for an undergraduate English course at the University of Louisville on Mark Twain's A Connecticut Yankee in King Arthur's Court. I want to emphasize this was created in HTML at a time when hyperlinking text felt almost transgressive, barely three years after I had gotten my first personal computer (Windows 98!). My thesis was that you gain a deeper understanding of Twain's novel if you contextualize it within the utopian fiction written in the same period (1871-1891) as Yankee's publication (1889). As part of my research, I read several utopian novels, one being William Morris's News from Nowhere (1891).
The 1800s were the Industrial Age. For most people of the era -- like Yankee's protagonist Hank Morgan -- technology was welcome, inspiring, and unquestionably viewed as progress. It should therefore be no surprise not only that tech was foregrounded in many of the utopian novels of the 19th century, but that two-thirds of all utopian novels were written in the 1800s (Lewis Mumford, The Story of Utopias, 1962).
News from Nowhere, however, was different. William Morris is a fascinating Briton whose influence continues into the present, even if his name may not be well known in the United States. A Romanticist, artist, socialist, political activist, and prolific multi-genre writer, Morris was a close friend of the poet-painter Dante Gabriel Rossetti (Morris's wife was Rossetti's muse, and likely more) and a lover of architecture, medieval times, Icelandic sagas, and Arthurian legends. He was also a founder of the Arts and Crafts movement, eschewing manufactured goods for handmade ones. These passions are clear in Morris's utopia, set in a London of the future where impersonal tech is de-emphasized in favor of a strongly pastoral, bucolic society of artisans. That said, these utopians were not total Luddites -- they simply put technology in proper perspective. As one character puts it, "[W]hatever is made is good, and thoroughly fit for its purpose. Nothing can be made except for genuine use; therefore no inferior goods are made. Moreover, as aforesaid, we have now found out what we want; and as we are not driven to make a vast quantity of useless things, we have time and resources enough to consider our pleasure in making them. All work which would be irksome to do by hand is done by immensely improved machinery; and in all work which it is a pleasure to do by hand machinery is done without." (A fun aside: Morris also wrote high fantasy, such as the novel The Well at the World's End [1896], inspired by Grail quests and Arthurian knights. If Morris's resistance to the Industrial Age and love of Nordic sagas aren't enough clues that J.R.R. Tolkien was influenced by him, consider that The Well at the World's End also has a "King Gandolf" and a horse named "Silverfax.")
Surrounded by the belching smokestacks of British industry, Morris turned toward the pleasures of the tactile and handcrafted. Nearly a century and a half later, we see Morris's aesthetic play out today, as we seek vintage forms of pleasure ourselves. Vinyl record sales have never been higher, and wired headphones are making a comeback, part of a larger "analog lifestyle" movement. In education, many schools are banning cell phones during class time and are having, as a recent New York Times article put it in its title, "Chromebook remorse." Fear and/or frustration over AI is fueling analog over digital instruction. I have strong opinions on both sides of this divide. As a Digital Learning Consultant, I often see this as an overreaction, a problem born of tools misused or overused during the pandemic that have created lingering, triggering edtech fatigue, and I lament that technology isn't being implemented with the intentionality needed to leverage student learning. As an author who recently wrote a book encouraging tabletop role-playing games in the classroom, I also appreciate the appeal of face-to-face student interaction using old-fashioned paper and pencil (and dice!). As a social media consumer inundated with AI slop videos in my feed, I can certainly sympathize with those who want a return to quainter times when we did not have such a "vast quantity of useless [digital] things," produced quickly, amazingly, pointlessly, and soullessly.
So, in the face of such a morass, what do we do, and how can we move forward? As we ultimately return to the notion of creativity, I'd like to offer one possible answer as I finish threading the multiple topics of this blog entry together (from the seemingly random Chaucer reference at the beginning, to the Middle Ages, to Morris, to Tolkien, to the world's most famous fantasy-themed TTRPG). But first, a quick flashback to our podcast recording. Matthew and Cyndi discussed how barriers and boundaries increase creativity, even though logic presumes they should constrain it. I couldn't agree more. The same thing powers the best games: despite what superficially seem to be the limitations of rules you can learn in minutes (how each chess piece moves on the board), the creative possibilities and strategies are nearly endless, and the game can take a lifetime to master. We cannot think outside of a box without the box in the first place.
Here is where artificial intelligence steps in, for good or ill. It may have its place, but we must be careful that it does not wipe away all constraints and, with them, the chance for complex thinking. AI can potentially give us god-like abilities with a keystroke, but creativity will suffer if such synthetic omnipotence is used without a greater purpose. Besides, omnipotence is boring. If a "problem" can be answered with AI that easily, what kind of problem was it, really? There is a world of difference between solving a linear equation on a worksheet and tackling an authentic challenge that requires mathematical thinking. The former needs only a machine (and perhaps not even AI, but a pocket calculator). The latter needs us.
And this is where looking back to the medieval age may give us a way out. In the spirit of William Morris, perhaps we can start an Analog Artisanal movement. In the urgency of our present circumstances, we do not need god-like powers, but guild-like powers. We need teams of humans, talking through our differences of opinions, working collaboratively, holding each other to the highest standards, looking for ways to apprentice the next generation to build our future world with care.
We need to celebrate and cultivate that creativity that lives within each of us, knowing that we will only hone our craftsmanship after mentorship, practice and failure. We may still need a forge or an anvil to get there. But we should never mistake the anvil for the blacksmith.
When the podcast is published in May, I will update this blog entry with a link!
