Art schools are being torn apart by AI

When my baby brother, a 3D modelling and animation student, talks to me about his projects and studies, the pride I usually feel is becoming increasingly tainted by a growing sense of dread. As a creative professional and former design student myself, I understand all too well how fierce the competition for postgraduate jobs will be, but his future is being threatened by something that barely even existed during my own time in higher education: generative AI.

College students are feeling that fear as well. Earlier this year, in a small protest at CalArts, posters requesting the help of AI artists for a thesis were reportedly altered with anti-AI messages, and anti-AI flyers were placed around campus. A film student at the University of Alaska Fairbanks destroyed another student’s allegedly AI-generated display piece by eating it in protest.

Right now, almost any creative task you can think of can be assisted or even entirely completed using generative AI tools. The technology has rapidly become more capable in just a few years. Text-to-image models like Midjourney and Google’s Nano Banana can spit out images in a wide variety of styles based on short descriptions. Music generators like Suno and Udio are allowing users to infiltrate streaming platforms with AI songs that sound kind of like popular human artists. AI video models like Veo 3, Bytedance’s Seedance, and OpenAI’s Sora (before it was killed off last week) are spooking actors, animators, and VFX artists alike. It’s difficult to predict which creative processes will land in the AI crosshairs next.

Meanwhile, reckless AI evangelists and grifters across social media platforms make wild claims about how much design and media can be automated without any professional skills every time a new model is released, despite the glaring copyright concerns that often surround such models. At the same time, AI providers like Adobe, OpenAI, and Google insist their tools are designed to aid creatives rather than replace them or reduce demand for their labor.

The message to creators is clear from all sides: embrace AI, or risk getting left behind. And sometimes that message is coming from the very art schools that exist to nurture creative skills. The Massachusetts College of Art and Design (MassArt), California Institute of the Arts (CalArts), London’s Royal College of Art (RCA), and many other creative-focused higher education institutions now encourage students across a range of disciplines to explore the current generative AI landscape.

“At CalArts, we aim to incorporate critical engagement with generative AI into our courses and programming to ensure our students can play an active role in shaping future technologies instead of simply reacting to them,” CalArts communications lead Robin Wander told The Verge.

That doesn’t mean AI tool guides are replacing existing curricula, or that students are necessarily expected to use the technology in their own work. They are expected to know how they can use AI, however. That includes its technical limitations, and often, the ethical and legal implications behind it. Many institutions have implemented AI usage policies for students and faculty in the last few years, which largely push the same message: it’s better to learn and understand these emerging technologies than risk being replaced by them out of complacency.

And while these institutions are grappling with the ethics of AI, they’re also recognizing the threat of the technology’s spread and dominance over creative industries.

“We recognize the complicated landscape of AI tools, many of which mine and share/sell user data, are trained on biased datasets, and have significant impacts on the environment,” reads one such statement published by the Pratt Institute. “At the same time, we also recognize that fluency with AI tools is a growing competency sought by employers and an area of professional development across many industries.”

The approach at CalArts is much the same. The school aims to provide the latest tools to its students alongside opportunities “to work directly” with organizations like Adobe and Google that are developing them, according to Wander, while also encouraging “critical discourse on the cultural, creative, ethical, and environmental implications of using AI.”

The goal for art educators is to ensure creative professionals remain essential to their respective industries by helping them to either master AI tools or continually evolve to surpass them. For Ry Fryar, assistant professor of art at York College of Pennsylvania, attaining that goal means teaching students how AI tools can be used to complement their existing creative processes instead of eroding them. In many cases this comes in the form of ideation — using AI tools to visualize concepts and designs in the planning stages, but not for the final results.

“The focus is on creativity itself, because without that, the results are common, therefore dull and fundamentally inexpert,” Fryar told The Observer. “We work with students on how to guide AI tools at a professional level, stay aligned with developing good practices, and understand current copyright law, ethics, and other standards for responsible AI use.”

Some courses require more direct involvement with AI tools, such as those provided by the Chanel Center for Artists and Technology — a new CalArts initiative that describes artificial intelligence and machine learning as key focus areas. At Arizona State University (ASU), a class called “The Agentic Self” will be led by musician will.i.am (aka William Adams) in Spring 2026, teaching students at the university’s Games, Arts, Media, and Engineering school how to build their own agentic AI system that can somehow serve as “a digital extension of their creative identity, curiosity, and goals.”

According to will.i.am, the course “represents a solution to AI replacing human jobs.” ASU says the partnership will build on the musician’s Focus Your Ideas (FYI) AI tool — a creative ecosystem that allows users to share projects with collaborators, generate text and images, and ask the platform’s chatbot for design advice.

“We are always looking for ways to innovate how we teach to better prepare our students to meet the moment,” ASU President Michael Crow said in the announcement. “Our graduates must be ready for the powerful shift in jobs toward AI.”

Some students and educators haven’t taken kindly to generative AI tools becoming a part of creative courses, mirroring the negative sentiments that are also being widely expressed by professionals in the industry. There are concerns surrounding how generative AI models are trained — in many cases, by scraping protected works without the creators’ consent or providing compensation — and how automating design work may result in fewer job opportunities as companies try to cut their staffing costs.

I doubt many students passionate enough to study a skilled creative craft (and pay the often lofty education costs to do so) are thrilled about becoming overqualified prompt engineers. One study conducted by the Ringling College of Art and Design in late 2023 found that 70 percent of its students felt “somewhat” or “extremely” negative toward AI, and most stated outright that they didn’t want it in the curriculum.

Still, creative institutions are pushing ahead. Wander says that schools have a responsibility to help students explore and critique these tools directly because technology will always be a part of the creative industries.

“This is the best way to equip creative communities with the skills and knowledge to influence how these tools evolve and how they are used in creative work,” said Wander. “As with any emerging technology, there are a range of perspectives among students and faculty about AI in the creative industries. Some are deeply skeptical. Some are early adopters.”
