Andy Kitchen
Andy Kitchen is an AI/neuroscience researcher, startup founder, and all-around hacker. He co-founded Cortical Labs, where the team taught live brain cells to play Pong. He's still trying to figure out how to catch the ghost in the machine.
Session
Community Notes: if you haven't heard of this algorithm yet, you'll come across it in 2025. Originated by the team at then-Twitter and developed as open source, this year Meta will be adopting it across all of its platforms (and spinning down all other content moderation, but that's a separate topic). Let's focus on what's interesting about Community Notes: a blend of algorithm and design that puts platform users in the driver's seat. In their own words, "Community Notes aims to create a better informed world, by empowering people on X to add helpful notes to posts that might be misleading." Just as importantly, "X doesn't write, rate or moderate notes (unless they break X's Rules)." This hands-off approach is particularly noteworthy in an era when model guardrails and constraints are increasingly part of the public consciousness, and not in a good way. People are bumping their heads as they sail into the wall at the edge of their AI-managed world, like Truman discovering the limits of his manufactured reality.
For anyone building an agent or thinking about dialogue-as-interface—this is the talk for you. Let's examine this approach to content moderation at the intersection of algorithm design and UX. It's an ambitious experiment: seeking truth not through central authority, but by bridging different human perspectives. We'll dive into both the mathematical engine that ranks these bridges of agreement and the design choices that drive user participation, while considering what this reveals about the broader challenges and possibilities of crowd-sourced content moderation.
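To give a flavour of that mathematical engine ahead of the session: the publicly documented Community Notes ranking models each rating with user and note intercepts plus latent "viewpoint" factors, so a note only scores as helpful when its intercept stays high after viewpoint-driven agreement is explained away, i.e. when people who usually disagree both rate it helpful. The toy sketch below illustrates that bridging idea only; it is not the production algorithm, and the latent dimension, learning rate, regularisation, threshold, and made-up ratings are all illustrative assumptions.

```python
# Minimal sketch of the "bridging" idea behind Community Notes ranking.
# Each rating is modeled as: global mean + user intercept + note intercept
# + (user viewpoint factor . note viewpoint factor). A note is surfaced as
# helpful only if its intercept is high once viewpoint agreement is factored out.

import numpy as np

rng = np.random.default_rng(0)

# Toy rating matrix: rows = users, cols = notes,
# entries: 1.0 = rated helpful, 0.0 = rated not helpful, nan = unrated.
ratings = np.array([
    [1.0, 0.0, 1.0],
    [1.0, np.nan, 0.0],
    [0.0, 1.0, 1.0],
    [np.nan, 1.0, 1.0],
])

n_users, n_notes = ratings.shape
k = 1        # one latent "viewpoint" dimension, as in the public write-ups
lam = 0.1    # L2 regularisation strength (illustrative value)
lr = 0.05    # learning rate (illustrative value)

mu = 0.0
user_int = np.zeros(n_users)
note_int = np.zeros(n_notes)
user_fac = rng.normal(scale=0.1, size=(n_users, k))
note_fac = rng.normal(scale=0.1, size=(n_notes, k))

observed = [(u, n) for u in range(n_users) for n in range(n_notes)
            if not np.isnan(ratings[u, n])]

# Fit by stochastic gradient descent on squared error with L2 regularisation.
for _ in range(2000):
    for u, n in observed:
        pred = mu + user_int[u] + note_int[n] + user_fac[u] @ note_fac[n]
        err = ratings[u, n] - pred
        mu += lr * err
        user_int[u] += lr * (err - lam * user_int[u])
        note_int[n] += lr * (err - lam * note_int[n])
        uf, nf = user_fac[u].copy(), note_fac[n].copy()
        user_fac[u] += lr * (err * nf - lam * uf)
        note_fac[n] += lr * (err * uf - lam * nf)

# Notes whose intercept clears a threshold are treated as "helpful";
# 0.4 echoes the publicly documented cutoff, but the exact value here is illustrative.
for n in range(n_notes):
    status = "helpful" if note_int[n] >= 0.4 else "needs more ratings"
    print(f"note {n}: intercept = {note_int[n]:+.2f} -> {status}")
```

The design choice this captures is the interesting part: a raw vote count rewards one-sided popularity, whereas the intercept rewards agreement that bridges the latent viewpoint dimension, which is exactly the "bridges of agreement" the talk will unpack.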