Human-centered systems design for the AI era
You're ambitious, optimistic, and moving fast. You've got people building agents in every corner of the business. But are you and your leadership team aligned around the existential questions:
Who do we want to become on the other side of this transformation? And honestly — how many people do we need? What behaviors, skills, and human+AI systems do we need to get there? And what do we want those people to believe in, build toward, and find meaning in?
AI is the medium. Not the mission. In the pre-AI era, leaders set strategy and left the systems design to IT. That division has collapsed. Human+AI systems design is now every leader's job — and it's the wild, wild West. We can help you and your top leaders learn these skills and design your future, together.
The shift we're designing toward
Culture eats strategy for breakfast. Undesigned AI will eat your culture before you've finished your coffee.
For decades, leaders set strategy and trusted their culture to execute it. That worked — because the systems running the business were slow enough, human enough, and visible enough to course-correct when they drifted from purpose.
That's no longer true.
AI systems move fast, operate invisibly, and scale whatever values — or absence of values — they were designed around. And right now, most of them are being designed by whoever is most enthusiastic, most technical, or most available. Not by the leaders whose job it is to steward the organization's purpose and future.
We believe the AI era can unlock a whole new and better world of work for humans. But only if leaders step up and into their new roles as human+AI systems designers and stewards.
The fixed stars don't change — your purpose, your values, what you will and won't trade away. But the vessel you build to get there needs to be designed consciously, intentionally, and with humans at the center, for this moment.
That's the work. That's why we're here.
A consultant who disappears and hands you a deck hasn't solved a design problem. They've made a category error — treating organizational design as something that happens to people rather than with them. The workshop isn't our delivery mechanism. It's our design principle.
The work we do together can't happen between meetings. It requires your leadership team to step out of the day-to-day, into a space designed for hard thinking — fully supported, with nothing to manage but the ideas in the room. You show up. We take care of the rest.
And the experience of doing this work together — the hard conversations, the honest design choices, the moments of genuine alignment — that is culture-building. You don't just leave with better foundations. You leave having built something together. That changes a team in ways a strategy offsite never does.
Multi-day workshops
Structured across multiple days to allow for divergent thinking, synthesis, and genuine decision-making — not just alignment theater. Each day builds on the last.
San Francisco by design
We love to host in SF — not for the geography, but for the atmosphere. This is where the AI era is being built, with all its human complexity and genuine excitement. Absorbing that context changes how leaders receive the design work. Dinners, conversations, and — if you'd like — guest speakers from leading AI firms.
Skills, not just content
Your team leaves with frameworks, foundations, and new human+AI systems design skills. But they also leave having shared an experience — thinking hard things through together, in a new context. That shared experience is the beginning of the AI-ready culture you're building.
The best digital-first companies don't run on better org charts. They run on stronger priors — clear purpose, operational values, and shared design principles that let leaders make aligned decisions independently, without waiting for permission from the top.
Think of it as Bayesian thinking for business. The foundational layer — your purpose, your values, your design language — is the strong prior, the belief you change slowly, if at all. Everything else (structure, tooling, roles, human-AI boundaries) is a posterior: it updates continuously as conditions shift, locally, without crisis.
The CEO of a well-designed organization makes fewer decisions. Not because they've stepped back — because they've built something that doesn't need them to step in. That's federated leadership. And it's what this work is actually building toward.
We didn't arrive at this work from one direction. These books represent the intellectual terrain — the design imperative, the honest account of what AI is actually doing, the counter-arguments we take seriously, and the thinkers asking the hardest questions about what it means to be human alongside other kinds of intelligence.
The foundational case for human-centered design at the strategic level. The methodological ancestor of what we do.
Design thinking belongs at the top of the organization, not just in product teams. The intellectual argument our practice builds on.
The most thorough exploration of purpose-driven orgs. Predates AI — more urgent now than when written.
Leadership and the New Science
Organizations as living systems. The theoretical backbone of why continuous redesign is necessary — not optional.
The operating model for organizations that must learn and iterate continuously.
Incorruptible (forthcoming)
Watch this space.
The most rigorous account of how AI is restructuring firm economics. Essential context — even if our answer differs from theirs.
The most honest current book on working with AI, not around it. Directly relevant to our L3 work.
How to lead organizations built on collective creativity. The leadership model the AI era actually requires.
The case for purpose-driven, long-horizon leadership. The why behind all of this work.
The book that put AI existential risk on the mainstream map. Understand the stakes before you design.
If AI does most of the work — what is human purpose for? The question hum(Ai)n exists to help leaders answer.
A clear-eyed case against technochauvinism — the belief that technology is always the solution.
When AI is designed without diverse human input, the bias isn't incidental — it's structural.
The Age of Surveillance Capitalism
What happens when technology platforms design organizations without human values as the constraint. Heavy but essential.
How technology undermines our capacity for collective understanding — and why that matters when organizations make AI decisions.
A radical expansion of what intelligence means. Changes how you think about the human-AI boundary.
We'd love to hear from you.
Get in touch →