9 Comments

Thank you for soliciting and sharing these reflections, which provide useful food for thought. To add an observation of my own: I think the contributions are focused a little too much on what we can or can't do with AI, and not enough on how we may be shaped by it. None of your contributors have been molded by AI's ready availability; they all come to these questions already formed by a deep education and long experience of thinking carefully about complex questions.

I teach at a small, non-elite, Christian university. I'm a political theorist, but at a small school I teach widely, cross-listing several courses and teaching in our general education humanities sequence. My students come to me as 18-year-olds. Few of them have received especially good educations in the past; few have been encouraged to develop a deep love of learning; those who are bright may have learned mainly how to succeed reasonably well with modest effort. (Of the exceptions to this general description, probably 75% have been homeschooled.)

What I most want them to take away from my classes (and those of my colleagues) is a love of learning, a sense that the world is a vastly interesting place, a desire to keep wondering and asking questions and exploring new things. They will remember relatively little of the specific content I teach them, but if they leave college with this, it will stand them in good stead for the rest of their lives.

I worry that easy access to AI will make this much harder. It has become far too simple for students to generate written work that is pretty good without ever doing the work of reading the assignments and engaging with them. The temptation not to think, and not even to want to think, is very powerful, and it is asking a lot of an 18-year-old to resist that temptation. As C. S. Lewis wrote in "The Weight of Glory," there are certain experiences that bring great rewards--his example was learning Greek--but rewards that are only available to those who have already become adept at the relevant activity. My experience so far--limited, obviously--leaves me afraid that AI is making it very easy indeed for students to "succeed" in ways that will leave those rewards forever unavailable to them.

Aug 1 · Liked by Joel J Miller

Thank you. As a just-retired English teacher, I share your concerns. Sadly, many colleagues were all in for AI. I worry about what students are going to lose or never get.

Jul 31 · Liked by Joel J Miller

Whenever I see an AI-generated image, it leaves me cold. The lines of AI-generated images are generally smoother than those of a photographed or painted image, yet there is a lack of depth in that very smoothness. I don't mean perspective depth, I mean emotional depth. AI images lack humanity.

So I would question AI's ability to detect the true depth of human creativity. Take the example used here of the symbolism of the yellow roses.

Dorothy L. Sayers, in 'The Mind of the Maker', notes that often a writer may not realize all the foreshadowing and symbolism that is in their work. They may make a creative decision for one reason, and only later will readers see another reason. This matches my own observation that when a work, whether in words, images, or sounds, is good, it often draws a whole host of interpretations that the original creator had not thought of during creation. So, will AI really be conveying what Wharton was thinking about the yellow roses, or only what readers think she was thinking about them?

The rose example reminds me of a story that a relative told me about why he doesn't like to read. In high school, he had to read 'The Great Gatsby' and then answer questions about the book. One of the questions was 'Why was the light on the dock green?' My relative's answer was based on what he knew from living in a harbour town - he suggested the light colour was a signal for boats. His English teacher flatly told him he was wrong - she had wanted an answer that gave a symbolic significance. Her reply, and the contempt for him conveyed in it, convinced him that the apparent subtleties of literature were not for him. Even if AI could generate sophisticated answers about the significance of yellow roses, if those answers will simply be used to generate questions that make another generation of potential readers feel the machine knows better than they do, it will only drive them further from the humanities.

Aug 1 · Liked by Joel J Miller

I really like your post here. I don’t think AI belongs in literature or the other arts. It can’t touch the subtleties of real art. I have been surprised at how angrily some respond to criticism of AI.

Jul 31 · Liked by Joel J Miller

Really interesting observations from different perspectives - thank you, Mr. Miller, for bringing them together. After reading through them, it occurred to me that authentic writing by humans might become treasured the way handmade furniture is today. You can buy inexpensive, mass-produced tables and chairs, but when you come across a handmade dining room table, you immediately perceive its higher quality.

Also, I teach high school math, but I've never accepted the word-people/math-people split. One of my school's English teachers let me teach a lesson on C. S. Lewis' Till We Have Faces, and I had one of our dance instructors use movement to teach vectors!


This is one of the best explorations I’ve read on how AI might affect the arts/humanities, and what humanistic thinking can offer us now—thank you! Really appreciate the people you pulled together and your insightful questions. Like Shadi Bartsch, I’m quite interested in ways to bring together the “two cultures” of STEM and the humanities—they’re so often depicted as entirely polarized, antagonistic world views, but many of the thinkers I admire the most try to work across these two cultures! And that feels more and more necessary, as people start experimenting with AI for skills that are not merely quantitative (mathematical calculations, physics modeling, etc.) but increasingly touch on skills typically associated with a humanistic education (reading texts, summarizing and synthesizing them).

I’m actually quite optimistic that art and literature will remain—people value the human subjectivity associated with cultural works! But I’m so curious (and simultaneously nervous and excited) to see how cultural production transforms because of AI—and how people will rethink labor and what it means to earn a dignified living from one’s work.


I am especially glad to have been involved with this now that I have read the other answers—much excellent commentary here, and lots to think about.

Jul 31 · Liked by Joel J Miller

Very intelligent piece. Who knew that Goethe inspired the induction motor!?!?

Jul 31 · edited Jul 31 · Liked by Joel J Miller

I’ll cut to the chase: AI, LLMs, etc. can’t suffer. What I mean by suffering is the irreplaceable removal of something (or someone) resulting in lasting grief. That leads me to recast Descartes: “I suffer. Therefore, I think on my suffering and wonder who I am.” An LLM can mimic it, but only as a reflection of the human record it has received. I’d suggest the diversity of creativity available only to humanity is derived, in part, from the wonder of actual or potential grief. Predictive LLM tools are useful in both STEM and the humanities.
