Life's not a rehearsal — unless it is

Generative AI is not inventing a new problem so much as accelerating an old one.

A lot of our recent work in learning and development has been about providing encounters with the situations people confront in real life: rehearsing scenarios and learning from those interactions. The simulations are driven by interactions with Large Language Models and are strongly normative. Participants are scored. Those scores are private, but they shape behaviour nonetheless. The aim is to produce better performances. Real life is a good deal messier than a simulation, yet rehearsal produces effective results. In the end, rehearsal is a tool to help people navigate the endlessly improvised real world.

Nathan Fielder in a pilot uniform being prepared by wardrobe staff, a scene from The Rehearsal

People talk about model collapse as if it were a new technical failure. The phrase describes what happens when AI systems are trained on AI output until the strange, marginal and less repeatable parts of the world are slowly squeezed out. The model learns to favour whatever is most likely to appear again. What survives is the familiar, the plausible, the statistically convenient.
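The squeezing-out is easy to sketch numerically. In this toy example (my illustration, not how any production model is actually trained), a Gaussian is repeatedly fitted to samples of its own output. Over the generations the estimated spread drifts towards zero: the tails of the distribution, the rare and marginal cases, are the first things to vanish.

```python
# Toy model collapse: fit a distribution, sample from the fit,
# refit on the samples, repeat. Each generation sees only what the
# previous generation produced, so rare outcomes slowly disappear.
import numpy as np

rng = np.random.default_rng(0)

def collapse(generations=250, n_samples=50):
    mu, sigma = 0.0, 1.0            # the "real world" we start from
    spreads = [sigma]
    for _ in range(generations):
        # the model's output becomes the next model's training data
        data = rng.normal(mu, sigma, n_samples)
        mu, sigma = data.mean(), data.std()
        spreads.append(sigma)
    return spreads

spreads = collapse()
print(f"spread at generation 0:   {spreads[0]:.4f}")
print(f"spread at generation 250: {spreads[-1]:.4f}")
```

Nothing here resembles an LLM's training loop; the point is only that recursion plus estimation error is enough, on its own, to shrink what a system can represent.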

But… this process is much, much older than AI. Long before machine learning, all images, stories and symbols worked this way. They have never simply mirrored reality. They've always relied on conventions of likeness, probability and recognition - on what a culture is prepared to accept as convincing. Generative AI is not inventing a new problem so much as accelerating an old one.

That's what makes the moment we're living through feel so strange and yet so familiar. Model collapse is not just degradation. It's a more visible, more automated version of a longstanding tendency for representation to normalise itself: to reward what resembles what is already known, and to edge out what is rarer, harder to classify or less easy to repeat.

The result is a culture increasingly organised around recursive resemblance. Not reality, exactly, and not pure fiction either, but things that feel real because they are close enough to prior versions of the real.

Two men dressed as airline pilots in a cockpit simulator, a scene from The Rehearsal


Nathan Fielder's The Rehearsal captures this beautifully. Its whole method is compulsive restaging: building scenarios, adding variables, trying to produce reality through rehearsal.

Part of the joke is that this arrives in the wake of a replication crisis. The Sokal affair was, amongst other things, an exploitation of a scholarly culture in which it had become impossible to tell shit from sugar. Worse, the social sciences have since been forced to confront how difficult it is to reproduce results in systems of enormous complexity. Fielder turns that problem into comedy. The more rigorously he tries to engineer plausibility, the more absurd the result becomes.

A large puppet figure of a woman in a red dress holding a baby, seated in a room with cowboy-themed wallpaper, a scene from The Rehearsal


That feels very close to the logic of AI. We're no longer dealing with representation as straightforward copying, but with recursive systems that generate convincing likeness through repetition, probability and feedback. The result is not reality exactly, but something that passes for reality because it is close enough to what has already been seen, modelled and rehearsed.

For organisations, that matters because the risk is not only bad output. It is the quiet shrinking of judgment. Systems built to privilege plausibility can also flatten difference, suppress outliers and make the expected feel truer than it is.


Our work follows a similar, though much less comedic, path to Nathan Fielder's. We are trying to create experiences that faithfully replicate real-life situations. They are safe spaces where people can rehearse what they might say or do in circumstances that are themselves often repeated. Overall they have a normative effect. This is entirely intentional. We're trying to train people - though, more correctly, trying to get people to train themselves - to behave in certain ways in certain circumstances. Predictable, reliable behaviour is great for insurance purposes. It can also improve the measurable performance of both individuals and teams. The nagging worry is that it eliminates non-normative, and perhaps innovative, ways of looking at situations.

The moments when the simulations break down might be as instructive as the moments when they completely take over.