
Creative AI Invades the Classroom—and Everywhere Else
What Does It Mean for Education and Work?

Michael Rogers

There’s little doubt: artificial intelligence is coming after human jobs, for everyone from the customer service rep on the 800 line to the young lawyer with a shiny new degree.

The key to this onslaught is machine learning—software that can train itself for jobs, rather than depend on strict programming by humans. The technology, in development for decades, has accelerated rapidly in the last ten years.

An early example of the technology came in 2016, when a program called AlphaGo beat the world champion at the ancient game of Go. Go's rules are fairly simple, but its 19-by-19 board has 361 points, which yields an enormous number of possible moves: chess has about 20 plausible opening moves, while Go has hundreds. Go masters teach the game through metaphors and similes rather than firm rules.

Researchers thought it would take until 2030 to teach a computer to win at Go. Then came self-teaching AI. Researchers at DeepMind programmed two computers with the rules of Go and had them play each other millions of times, learning constantly. They then pitted the self-taught software against the reigning world champion, and it won four out of five games. Another Go master, watching the match, said: "It's like another intelligent species opening up a new way of looking at the world."

In recent years, with names like cognitive computing or deep learning, self-teaching AI has been everywhere: sorting through piles of evidence for lawyers; reading X-rays; creating new formulas for pharmaceuticals or battery electrodes; even developing a novel alloy for a cheaper US nickel. AI has also moved into the visual world—making robot vision far smarter, and also creating new images on its own, from cloning dead movie stars to creating art that actually wins prizes in competition.

The latest shock has been the success of the machine learning program ChatGPT in imitating human writing. It's what's called a "large language model": software that has learned to write like a human by reading and analyzing the vast amounts of digital content stored on the Internet. All you do is give it a "prompt," such as: "I would like you to write an 800-word article about the future of artificial intelligence, giving examples of how it will be used and what the impact will be for humans and work." Moments later, ChatGPT comes back with the article.

The website CNET has already started to use AI to write some of its news articles. Multiple companies are developing customer service apps that use the new technology. (McDonald's has for several years been testing robot order-takers in its drive-thru lanes.) Some bloggers use ChatGPT to craft their posts: you simply give the program a brief overview of what you'd like to discuss, and the software turns out a complete blog post. The results aren't perfect, but you've basically got a first draft that's close to finished. (A few bloggers have used ChatGPT to write a blog post on "What is ChatGPT?")

And finally, high schools and colleges are already banning access to ChatGPT on campus, fearing that students will use it to write their essays. In fact, some already do: one professor in Michigan caught a student who handed in a paper that was "suspiciously coherent and well-structured."

Teachers are quickly adapting to the new reality. Some require students to write first drafts of essays while sitting in class. Software is being designed that will detect ChatGPT-authored essays. Colleges are even considering dropping the essay requirement on their student applications.

Those are defensive reactions. Some teachers are actively adopting ChatGPT in class as a teaching device, generating text that then drives classroom discussion. And, in fact, students need to become familiar with automatic writing software, because it will ultimately be common in everyday life and business.

But perhaps the biggest lesson for teachers from ChatGPT is this: education must identify the unique human skills that AI and robots can't duplicate. That will be crucial for future workers. I call these skills the Three C's.

An AI customer service rep will be extremely good at telling you everything about life insurance tailored to your needs. An empathetic human will be the one who talks you into raising the policy from $500,000 to $1,000,000.

There is a special energy in having multiple minds in the same room, brainstorming about a problem or challenge. Work to make AIs collaborate has been slow, and it's not clear it will ever have the same power as the human version.

These are problems where the boundaries of a useful answer are unclear. If you're a city looking to put in a new parking lot, AI will do a brilliant job of going through traffic density, accident reports, legal issues, zoning, and construction costs to pinpoint the most efficient place for new parking. But an AI will probably not ask: "Do we really want a new parking lot?"

These three skills are, of course, innately human abilities, but they are skills that young students, surrounded by distracting technology, may not learn or practice on their own. They need to be taught in real life, with students practicing them together.

Not coincidentally, educators will soon face a challenge to their own profession: AI will ultimately do part of what teachers do today, particularly in factual areas like math, chemistry, and grammar. And so, for their own job futures as well, teachers should begin to focus on the skills that only they can teach: the Three C's.
