A playground for exploring
complex systems — carefully.
Intentional explorations into AI-assisted guidance, data interpretation, and how people make sense of uncertainty.
Curiosity, with guardrails.
Explores: AI-assisted UX • Interpretive systems • Decision support • Data transparency
This playground is where I explore complex systems outside the constraints of a roadmap: AI-assisted guidance, data interpretation, and how people make sense of uncertainty when clear answers don’t exist. These are not product pitches or experiments for novelty’s sake; they’re intentional, carefully bounded explorations that mirror the challenges I design for professionally. I use this space to test how structure, language, and UX constraints can make ambiguity understandable without oversimplifying it, strengthening the systems thinking, judgment, and responsibility I bring to high-stakes, real-world products.
AI-assisted guidance · interpretive systems
Nefeli
An exploratory system for generating calm, contextual guidance in a highly subjective domain. Nefeli examines how AI-generated content can feel supportive and trustworthy when paired with clear constraints, explainability, and user control — without slipping into prescriptive or opaque advice.
Data interpretation · sense-making over time
Arounded
An exploration into how map-based systems can help people reason about environmental conditions over time — without turning uncertainty into false certainty. Arounded focuses on interpretive scaffolding, transparent data sourcing, and plain-language context instead of scores or alarms.
These explorations are intentionally scoped.
They’re spaces for testing ideas, constraints, and design judgment without the pressure to ship or optimize. What matters here isn’t completeness — it’s how systems are framed, how uncertainty is handled, and how trust is designed.