January 28, 2026

AI, Creativity, and the Danish Exception

Why generative AI is not an individual skill issue, but an institutional choice with cultural consequences

AI is not an individual problem. It is a structural choice.

Anyone who has not been living under a rock understands by now that artificial intelligence can be a powerful productivity tool. In healthcare, industry, administration, and analysis, AI can free up time, reduce errors, and increase efficiency substantially. In these domains, it makes sense to talk about upskilling, competence development, and smarter workflows.

But the same logic does not apply to the creative fields. In fact, it points in the opposite direction.

In creative processes, AI does not contribute originality, judgement, or artistic risk. It contributes streamlining. It raises the floor, allowing more people to write a song for a party, generate a LinkedIn post, or deliver an acceptable first draft. But is it genuinely creative? No. What it produces is variation on what already exists. Mediocrity at scale.

Most people know this. The debate plays out repeatedly — from post to post, classroom to classroom, workshop to workshop. And yet, AI is still treated politically and institutionally as an individual issue: something each creative professional simply needs to learn to use better.

That is where the problem begins.

When a structural issue is consistently reduced to an individual competence question, it is not a misunderstanding. It is a way of avoiding conversations about structures, responsibility, and consequences. And that avoidance is particularly problematic in Denmark, because creativity has been one of our few genuine global competitive advantages — not in volume, but in quality; not in standardisation, but in distinctiveness.

From creative industries to content industries

Denmark has not led in the creative industries because we have produced more geniuses than other countries. We have led because we built structures that made creativity possible in practice: time, trust, institutions, public investment, and relative social security.

This is visible in film and television, where Danish writers’ rooms and long development cycles have produced series with international impact. It is visible in architecture and design, where solutions are not merely about form, but about ethics, use, and context. It is visible in the performing arts, where works emerge through rehearsal spaces, rewrites, mistakes, and collective processes.

All of this is characterised by the opposite of AI logic: slowness, friction, and uncertainty.

When generative AI enters the creative fields, a subtle but profound shift takes place: from creative industries, where investment is made in people and environments, to content industries, where the focus is on output, speed, and recognisability. AI excels at producing more content, faster. But producing more content is not the same as creating more culture.

It is worth remembering that generative models are trained precisely to imitate what already exists. They are statistical machines, not cultural actors. They can combine, vary, and repeat — but they cannot take responsibility, break norms, or develop taste. Yet these are exactly the technologies now being introduced into sectors whose legitimacy has historically rested on the opposite qualities.

Cultural institutions: from counterweight to acceleration zones

The problem is not that artists, writers, or designers experiment with AI. Creatives have always explored new tools. The problem arises when cultural institutions begin to legitimise the technology as a norm.

Cultural institutions are not neutral platforms. They define standards for quality, tempo, and working methods. When theatres use AI for text development, museums for mediation, production companies for idea generation, or educational institutions for teaching materials, they send a signal about what counts as legitimate creative work.

Internationally, the consequences are already visible. In Hollywood, both writers and actors have fought for contractual protections against AI replacing precisely those functions where learning, development, and income previously took place. In publishing, AI is increasingly used for slush reading and editing — not necessarily because quality improves, but because speed does.

The same movement is visible in Denmark: shorter development phases, more productions, fewer rehearsals. Rarely out of bad intent, but as part of an efficiency logic where no one wants to be the one “standing in the way of progress.”

But a cultural institution that primarily rewards efficiency is no longer a cultural institution. It is a content factory with a logo.

A new cultural class divide

The most serious consequence, however, is not artistic. It is social.

AI first replaces the tasks that previously made it possible to be young, inexperienced, and poor in a creative field: assistant roles, routine work, intermediate positions. This is where the craft was learned. Where rent was paid. Where access was gained.

When these functions disappear, creativity does not vanish. But access to it narrows. Only those with economic security, networks, or capital can afford to take the necessary risks. The result is a more homogeneous, more elitist cultural life — produced faster, but lived by fewer.

This is the paradox. Denmark has spent decades making creativity something one could work one’s way into — not something one had to be able to afford in advance. AI risks reversing that development.

Internationally, this realisation is beginning to take hold. In the United States, the debate is shifting away from whether AI can make art and towards who will be able to make a living from creative work in the future. The decisive issue is not quality, but pipelines and access.

In Denmark, this discussion has yet to be seriously conducted — politically or institutionally. Here, the question is still reduced to how the individual should learn to use the technology better.

When we talk about AI in the creative industries, we are therefore not ultimately talking about technology. We are talking about whether a society is still willing to invest in the slow, uncertain, and human processes that make culture more than content.

Creativity does not arise in a vacuum. It arises in institutions, communities, practices, and ways of life where experience can be built, shared, and challenged over time. If these structures are replaced by technologies designed for standardisation and speed, we do not merely change working methods. We change our relationship to meaning, judgement, and cultural continuity.

That responsibility cannot be placed on the individual artist, educator, or student. It belongs to politicians, cultural institutions, and business leaders — to those who shape the frameworks of the society we are to live in.

AI is not a force of nature. It is a choice.

And the choice is not about whether we can produce more content.

It is about what kind of human experience and cultural intelligence we want to pass on.

This is not a technological question.

It is a civilisational one.