People as Code 2 - Collapse, rewild, regenerate.
People as Code reflects on how generative AI, built from human data, now mirrors and shapes us. As AI flattens difference and optimises for plausibility over truth, culture risks collapse.
The original People as Code was a provocation written back in 2023, when the world was only just waking up to generative AI. I began it with no sense of if or when I might return, or where the thought might lead. Two years on, and we can’t escape GenAI. It’s everywhere, seeping into everything.
In Part 1, I imagined people as co-authors of systems, borrowing ideas from the digital world - modularity, repos of governance, commons infrastructure - and re-imagined ourselves as builders, co-authors, of living frameworks rather than passive components.
That early optimism feels more complicated now.
I wonder whether the code, and those who shape the largest models and the narratives of what counts as ‘right’, are starting to code us.
Generative AI is built from the code we produced: our voices, our texts, our stories - whether we knowingly consented or not. It mimics our tone, imitates approximations of us, and reflects them back as truth. The problem is that it never asks, “Is this actually true?” It only asks, “What sounds plausible enough?”
Every output of an AI model is, at some level, an approximate reflection of human data, our language, our creativity, our contradictions. But what happens when that reflection becomes the new input, when models start learning from themselves?
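That feedback loop can be made concrete with a toy simulation, nothing like a real training pipeline. Assume, purely for illustration, that each “generation” of a model keeps only its most plausible outputs (here, the values closest to the average) and then retrains on them. The `keep` ratio and the closeness-to-mean rule are my own assumptions, chosen to make the flattening visible:

```python
import random
import statistics

def next_generation(samples, keep=0.8):
    """One illustrative training round: keep only the most
    'plausible' outputs (those closest to the mean), then
    resample from what survives. keep=0.8 is arbitrary."""
    mu = statistics.fmean(samples)
    ranked = sorted(samples, key=lambda x: abs(x - mu))
    kept = ranked[: int(len(ranked) * keep)]
    return [random.choice(kept) for _ in range(len(samples))]

random.seed(0)
# 'Human' data: varied, messy, standard deviation around 1.
data = [random.gauss(0, 1) for _ in range(1000)]

for _ in range(20):
    data = next_generation(data)

# After 20 generations the spread has collapsed: the outliers,
# the 'edges', are gone, leaving an average of averages.
print(round(statistics.stdev(data), 4))
```

Even with most of the data kept each round, the variance shrinks geometrically, which is the flattening the paragraph above describes: nothing is ever added back, so difference can only drain away.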
The more we take our cues from AI, the more we fill our work, our feeds, our very thoughts with generated content, the flatter the world becomes. Ideas start repeating. The edges blur. The sharp corners of innovation and difference are sanded down until we become a pleasant, forgettable average of averages. We are choosing convenience over truth.
I see convergence outside the digital world play out daily. Organisations learn from the same case studies, repeat the same language, apply the same frameworks. Funders drive a narrative of what is valued, and organisations follow suit. Optimisation replaces curiosity. Risk avoidance replaces imagination. The feedback loop tightens until innovation quietly disappears.
And when differences, novelty, experimentation disappear, we stop learning.
We are chasing efficiency, as if it is the answer. Why read the report when we have the summary? Why attend the meeting when the assistant can recap? We now have applications for writing and applications for reading - an ouroboros of automation.
Funding processes follow suit: plausibly approximate assessments producing plausibly approximate responses, all trained on data no one local ever shaped. The more we embed these systems, the more we encode existing power. The training sets, the fine-tuning, the institutions behind them all carry assumptions about what counts as credible or valuable.
But who gets to decide what is credible and valuable? Who gets to edit the next version? Perhaps credibility doesn’t live in datasets at all, but in the stories communities tell and keep alive.
In Data as Conversations I described how data, and by extension dialogue itself, should be treated as conversations, not verdicts - as living processes that evolve through participation rather than endpoints we rush to measure. When we reduce conversation to content, or data to dashboard, we lose the nuance that makes understanding possible.
What we need isn't always more polish, but more offness - the friction, the offbeat rhythm, the space between the notes, the strangeness that resists prediction. Originality rarely comes from what’s likely; it comes from what doesn’t quite fit.
The more we surrender our processes to generative systems, the more we risk being shaped by their smoothing logic. This matters deeply, especially in social and creative work. Whose voices get amplified? Which ones get tidied away? If we continue as we are, we already know the answer: the polished, the safe, the globally recognisable.
But maybe there’s another way. Paul Taylor reminds us that community memory outlasts organisational memory - that what endures isn’t the strategy deck but the relationships, the stories, the slow accumulation of trust. Communities remember what institutions forget. They hold the lived data that no model can scrape.
Here lies a possibility: to shift from extractive and destructive logics towards regenerative ones. Generative AI, and what we take from this moment, doesn’t have to lead to a collapsing system; it might even help renew it. Rather than focusing on optimisation and convergence, we could focus on regenerative models, both in the digital world and in our organisations and sectors.
In the digital world, a regenerative model would learn with communities, not merely from them. It would adapt to context rather than erase it, emphasising diversity over convergence. Imagine a network of small, open, local models - trained on community histories, public archives, and collective wisdom - a fabric of plural intelligences instead of a single dominant one, a forest rather than a cultivated lawn. To get there we need to see the possibilities and act with intentionality and humility.
A forest rather than a cultivated lawn.
A real forest is not a singular thing. Trees of varying height and age shape the light, moisture, and sound of the air. Their roots map unseen geographies underground, each species holding space for others, shade for seedlings, hollows for insects, corridors for birds. Diversity above creates diversity below.
When leaves fall, they don’t vanish; they return. Leaf litter becomes mould, mould feeds fungus, and mycelium threads nutrients between roots - a vast cooperative network some call the wood wide web. Information, nourishment, and warning signals move through that underground mesh, not by command but by relationship. Each element, through its decay, makes the whole system more alive.
Our social and digital systems could learn from that. Most organisations, by contrast, are designed like plantations: rows of identical trees pruned for predictability, each vying for sunlight, roots separated, growth measured in quarterly metrics rather than centuries of soil. These systems can look efficient, but they are ecologically brittle. One change in weather - funding cuts, leadership turnover, new policy - and the canopy thins fast.
A rewilded sector would look different. It would embrace both the slow growth and the rapid change: long-term organisations or networks providing shade and stability, smaller ones taking root in the gaps, new shoots emerging from what decayed. It would recognise the importance of undergrowth - the informal networks, mutual aid, micro-projects, and community memory that keep nutrients circulating when formal structures falter. It would have humility enough to know that we cannot control our way to progress.
The digital commons we build could function the same way. Mycelium, after all, is a kind of distributed infrastructure - a shared substrate that lets nutrients, data, and signals travel where they’re needed most. We could design our networks, data systems, and AI models in that image: not centralised databases or one-size-fits-all dashboards, but relational infrastructures that hold many voices, many contexts, and let information move laterally, not hierarchically.
The health of a forest lies not in control but in connectivity - the slow, continuous conversation between what grows, what dies, and what feeds the next generation. That’s the logic of regenerative systems: diversity as stability, decay as design, relationship as infrastructure.
Perhaps that’s the invitation - to build our organisations, our technologies, and even our stories, less like manicured lawns and more like living woods: tangled, interdependent, and quietly, perpetually alive.
Maybe it’s time to rewild.
Rewilding doesn’t mean chaos. It means designing for complexity, allowing systems to self-organise and surprise. In digital terms, that could mean smaller, federated networks, open protocols, data commons governed locally. In the social purpose sector, it could mean funders trusting uncertainty, supporting plural experiments, letting communities define their own measures of success.
I wonder if, by pretending we know what success looks like and how to optimise for it, we risk creating a monoculture. Everyone planting the same crops of “impact frameworks,” all aiming for growth, all tending the same metrics of success. Rewilding would mean letting native ideas return - slower, rougher, but more alive.
We’ve been trained to fear entropy - to see disorder as decay. But in living systems, entropy is what makes renewal possible. Rather than drive ever deeper into optimisation - efficiency at all costs, convenience over truth - we could design for entropy. Build systems and organisations that can fail gracefully, adapt, and regrow. Keep a little wildness in the loop.
Maybe imperfection is the last honest design principle left to us. Uncertainty by choice and by design.