Limicelia · Technology & AI
The question is not whether to adopt AI. It is whether the reorganization happening is the one you intended — and whether the people most affected had any say in it.
01 / What we are hearing
"Everyone on the team is using AI. We have no idea what we have collectively agreed to — or whether we have agreed to anything."
Governance gap
"AI is doing some people's jobs invisibly. Nobody is naming it. The people closest to that work are scared and not saying so."
Labor displacement, unspoken
"We have a public commitment to equity. The way we are adopting AI contradicts it. Leadership does not see it yet."
Values-practice gap
"We are a government agency implementing algorithmic tools that affect communities. Nobody in those communities helped design them."
Participatory design absent
"Leadership is excited about AI. Frontline staff are not. We keep talking past each other and nothing is moving."
Internal misalignment
"We built a decision-support tool for our team. Now the tool is making the decisions. Nobody noticed when that happened."
Authority migration
02 / What we actually do
We do not prescribe a technology strategy. We help organizations see clearly what technology is already doing — to decision-making, to labor, to power — and act deliberately from there.
AI Governance & Decision Frameworks
What decisions does your organization need to make about AI — and who needs to be in those conversations? We work with leadership and affected staff to name what is being decided, map who has voice in those decisions, and build frameworks for ongoing governance rather than one-time policy. The output is not a document. It is a living practice.
Participatory Technology Design
When communities are affected by a technology, they should have voice in how it is built. We design and facilitate co-creation processes that bring affected people into the development of tools, platforms, and systems — not as testers or focus groups but as genuine co-designers. This is distinct from user research. It is a different relationship to who holds authority over the design.
Organizational Readiness & Internal Alignment
Before the technology question is a technology question, it is a readiness question. Is leadership aligned on what problem AI is solving? Is there a shared understanding of what the organization is and is not willing to automate? We work with teams to surface these misalignments before they become expensive.
Values-Technology Gap Work
What your organization says about equity, labor, and community benefit should match how it makes technology decisions. When there is a gap — and there usually is — we help name it honestly and design a path toward alignment. This is not compliance work. It is the harder question of whether the technology an organization uses is actually an expression of the organization it says it wants to be.
03 / How this is different
04 / How to engage
Diagnostic
4–8 weeks
A listening process — with leadership, with frontline staff, with community if relevant — that produces an honest picture of how technology is affecting decision-making, labor, equity, and power in your organization. Not a technology audit. A relational inquiry. The right starting point before any strategy is designed.
Sustained Engagement
3–6 months
Building the structures, practices, and shared agreements that allow your organization to make technology decisions well — on an ongoing basis, not just for this implementation. For organizations implementing a specific tool or platform, or building a participatory design process for technology that affects a community.
A note on scope
Developing
We are developing this practice through live engagements. If the situation you are facing sounds like what we have described here, reach out. We will tell you honestly if we are not the right fit, and point you toward someone who is if we know them.
Invite-only evening · Thirty people · 2026
What is actually happening with AI inside organizations — told honestly, in a room built for it. Four or five leaders share what the public conversation won't say. The room responds. Nothing leaves without permission.
Duration
Two hours
Format
4–5 speakers · open conversation
Attendance
Invite-only
The gap
"Companies declared they no longer need engineers the way they used to. They are quietly posting hundreds of engineering roles. The public narrative about AI has come apart from lived reality."
This gathering is for the people holding that distance.
A peer learning cohort for senior leaders navigating AI transformation — not a curriculum to complete, but a practice to build together. Eight sessions. A small group. Real situations brought by the people in the room.
Format
Biweekly, two hours each. In-person where possible, remote where not. The cohort stays together for the full arc — no drop-ins, no observers.
Who it's for
CHROs, COOs, heads of transformation, L&D leaders. People responsible for how AI lands in their organizations — and who want to think that through with peers, not just consultants.
Size
Small enough that everyone's situation gets genuine attention. Diverse enough that the range of experiences is useful. Curated, not open enrollment.
Focus areas — your choice
How each session works
A real situation brought by one member — live, unresolved, genuinely uncertain
The cohort responds — not to fix, but to think alongside. Facilitated by Limicelia
A practitioner or researcher joins select sessions — not to lecture, but to enrich
Between sessions — brief written reflection shared with the group. No homework. One question.
First cohort — 2026
We are forming the first cohort now. If this sounds like what you need, reach out. We will tell you honestly whether the fit is right.
This is a genuine inquiry into what is happening in your organization. Tell us what you are navigating. We will tell you whether we can help — and if not, what might.
Begin a conversation