I started building things on the internet when it was still mostly text. In 1997 I co-founded a Denver technology company called Dream Team Technologies; it won the Colorado Business Ethics Award and ran for seven years. After that I spent a long time helping other founders build their products through JoyaSolutions, a consulting firm that worked with early-stage teams to ship product and train up the people who would carry the work forward.
For six years I led Holo, the company building Holochain, a peer-to-peer application framework. I wrote about open-source licensing as a form of self-sovereignty, and I learned what it takes to design infrastructure that does not require its users to trust a single company.
I now run Cirdia, a privacy-architected platform for women's wellness. We are building Tapestry, Nourish, Groundwork, and other apps that surface pattern intelligence without centralizing user data. The architecture is the product: data that stays with the user, no advertising surveillance, no third-party sharing. It is a small company doing specific things carefully.
I also host Terms of Service, a podcast where I sit down with the people thinking hardest about privacy, AI, governance, and design, and ask them the questions I want answered.
What I keep coming back to
Software encodes the values of the people who write it. That has always been true. What is new is that AI now sits between us and almost every interaction we have with our information, our health, our work. The systems doing the inferring are trained on a corpus that systematically underrepresents women: our voices, our bodies, our experience. The prompts shaping how those systems answer are written, overwhelmingly, by people who do not know what they do not know.
So I keep asking: who is writing the prompts? Whose lived experience is the design centering? What assumptions are quietly encoded in the infrastructure we treat as neutral?
These are not abstract questions. They show up in whether a wellness app stores your cycle data on a server you cannot reach. They show up in whether a wearable returns your data to you or sells it to an advertiser. They show up in whether AI talks down to women about their own bodies because the corpus it learned from did.
I have been watching patterns for a long time. The pattern I am most interested in right now is the one between who builds the technology and who lives inside its consequences. I am currently opening conversations about what a women's wellness AI cooperative could look like, and who would build it.