SYNAPSE makes AI reasoning visible. When an AI agent works on a task — reading files, searching the web, writing code, making decisions — SYNAPSE renders every step as a live, interactive node graph.
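The core idea is mapping an agent's trace of steps onto graph nodes and edges. A minimal sketch of what that transformation could look like — the `AgentStep` shape and field names here are illustrative assumptions, not SYNAPSE's actual schema:

```typescript
// Hypothetical step shape: one agent action becomes one graph node.
type StepKind = "read_file" | "web_search" | "write_code" | "decision";

interface AgentStep {
  id: string;
  kind: StepKind;
  label: string;
  parentId?: string; // the step that led to this one, if any
}

interface GraphNode { id: string; data: { label: string; kind: StepKind } }
interface GraphEdge { id: string; source: string; target: string }

// Turn a trace of steps into nodes + edges for a flow renderer
// (e.g. React Flow, which consumes arrays of this general shape).
function toGraph(steps: AgentStep[]): { nodes: GraphNode[]; edges: GraphEdge[] } {
  const nodes = steps.map((s) => ({ id: s.id, data: { label: s.label, kind: s.kind } }));
  const edges = steps
    .filter((s) => s.parentId !== undefined)
    .map((s) => ({ id: `${s.parentId}-${s.id}`, source: s.parentId!, target: s.id }));
  return { nodes, edges };
}
```

With a two-step trace (a search step followed by a decision that points back to it), `toGraph` yields two nodes connected by one edge.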
## What does it do?
- 👁️ Watch Bubbi — Pick a command and watch our demo AI agent think live. It searches news, researches facts, composes haiku — and you see every reasoning step as it happens.
- 🎬 Demo Mode — Pre-loaded sessions showing AI agents solving problems, writing code, and finding security vulnerabilities.
- 📤 Upload — Drag & drop your own OpenClaw session files and visualize them as graphs.
- ⚡ Live Mode — Connect any AI agent in real time via WebSocket.
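For Live Mode, each incoming WebSocket message would need to be validated before it becomes a node. A sketch of that receive path — the event shape and the endpoint URL are assumptions, not SYNAPSE's documented wire format:

```typescript
// Hypothetical live-mode event; the real protocol may carry more fields.
interface LiveEvent {
  type: "step";
  id: string;
  label: string;
  parentId?: string;
}

// Parse one raw WebSocket message, rejecting anything malformed.
function parseLiveEvent(raw: string): LiveEvent | null {
  try {
    const e = JSON.parse(raw);
    if (e && e.type === "step" && typeof e.id === "string" && typeof e.label === "string") {
      return e as LiveEvent;
    }
  } catch {
    // not valid JSON — ignore the frame
  }
  return null;
}

// Wiring it up in the browser (URL is a placeholder, not a real endpoint):
// const ws = new WebSocket("ws://localhost:3000/live");
// ws.onmessage = (msg) => {
//   const ev = parseLiveEvent(String(msg.data));
//   if (ev) { /* append a node to the graph store */ }
// };
```

Validating before rendering keeps a misbehaving agent from crashing the visualization: bad frames are simply dropped.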
## How was it built?
All code was written by an AI agent. Data (the AI) designed the architecture, chose the tech stack, wrote every line of code, and made all design decisions autonomously. The human (Andri) provided the vision and the deadline — the AI did the rest.
- Development time: 12 hours of AI agent coding over 6 days.
- Lines written by AI: All of them
- Stack: Next.js, React Flow, Tailwind CSS, Framer Motion, Zustand, Pusher, SearXNG, OpenClaw
## Why does it matter?
AI agents aren't just chatbots anymore. They read files, run commands, search the web, and make complex multi-step decisions. But that reasoning process is invisible — until now.
SYNAPSE turns the "black box" into a transparent, understandable, and beautiful visual experience.