As generative AI becomes more visible across music, art, and design, the creative community is rightly asking questions. Does this technology help artists, or replace them? Is it empowering, or exploitative? Should we reject it altogether or shape how it’s used?
There’s strong criticism out there, much of it valid. Concerns around consent, sustainability, and fairness are serious. AI models are often trained on the work of human creators without permission. There’s a risk that what begins as a tool for making quick posters or experimenting with song structures ends up replacing the work of illustrators, composers, and musicians. And yes, these models use vast amounts of energy to train and operate, adding to the environmental cost of digital life.
But criticism can’t be the end of the conversation. And this is hardly a new situation.
Think back to the early days of graphic design software (or music editing software, or photo editing software – take your pick). It was expensive, complex, and out of reach for most people. Over time, the interfaces improved and became more intuitive. Now, almost anyone can create posters, edit album art, or make social media content using apps that require no formal training. AI feels like a continuation of that same process, lowering the barriers and simplifying the tools. That’s not inherently harmful. I don’t remember anyone predicting the downfall of an industry when tools were built to make its practitioners’ lives easier.
In fact, maybe the real issue isn’t the technology itself but how people choose to use it.
Should someone use AI to create “a song in the style of [artist X]” or “artwork that looks like [artist Y]”? Many would say (and perhaps should say) that crosses an ethical line into exploitation: taking someone else’s work without credit or consent. And yet this behaviour is not new. Artists have always borrowed, adapted, and absorbed ideas. The difference is that AI can do it instantly, impersonally, and without accountability. That raises the stakes, but it also reminds us: the problem isn’t the tool. It’s how we use it.
The fact that some people will misuse the technology doesn’t make the technology itself the enemy. The problem is the user. It’s the systems we’ve built, the incentives we’ve normalised, and the polarising lens we often apply, labelling things good or bad, real or fake, art or not, when the truth is far more complicated. If we can step back from those extremes, we might be better placed to have a conversation that’s actually productive.
That includes a more balanced view of AI’s environmental impact. Yes, training large AI models consumes a lot of energy. But so do many other pillars of the modern music industry. Streaming platforms like Spotify rely on massive data centres that run continuously. Social media promotion — now a core part of any artist’s toolkit — requires global cloud infrastructure. Touring involves flights, freight, fuel, hotels, and venues with large energy demands. Even home recording, often seen as low-impact, depends on high-spec computers, digital processing, and cloud-based storage. In other words, the entire digital creative economy has a carbon footprint.
This doesn’t excuse AI’s energy use, but it puts it into perspective. If we’re serious about sustainability, we need to look beyond a single technology. We need to ask how the whole system can be made more efficient, more local, and more accountable, not just scapegoat the newest player in the room.
So Where Does That Leave Us?
AI isn’t going away. But we still get to decide how it fits into our creative lives. Will it be a shortcut that devalues artists or a tool that expands opportunity and access? Will it erase originality or support it with new possibilities?
These aren’t yes-or-no questions. They need reflection, not reaction.
Technology has always changed the arts: from home recording to photo filters, from drum machines to desktop publishing. In each case, we’ve had to reckon with new tools and new ethical choices. Generative AI is just the next chapter in that story. The challenge is to approach it with clear eyes, open discussion, and an unwavering focus on the values that matter: consent, credit, creativity, and care.
The technology is not the problem. The problem, and the potential, is us.