Dreamer Graph
A social dream journal — “The Napping Network.” Log dreams, get AI sentiment analysis and tag suggestions, browse a 300+ symbol codex, and surface recurring patterns in your subconscious over time. Currently in closed beta with a small test group ahead of a summer 2026 launch.
| Client | Personal project |
| --- | --- |
| Role | Interaction Designer — Product Design, Full-Stack Development |
| Deliverable | Web application (React + Supabase) — closed beta |
| Venue | Summer 2026 launch |
Overview
Dreamer Graph started as a personal experiment: a place to log dreams against a structured codex of symbol meanings, inspired by the way StoryGraph approaches reading data. The idea was to treat the subconscious like a dataset — not to interpret individual dreams in isolation, but to surface patterns in recurring symbols, emotional states, and the people who appear across entries over time.
The social layer came later. Dreams are inherently private, but there’s something interesting in the shared experience of recurring dream types — the falling dream, the late-for-something dream, the impossible geography. A lightweight social feed lets friends share entries and react to each other’s dreams without the weight of a full social platform.
The project is solo — concept, product design, and full-stack development. The stack is React + TypeScript on the frontend, Supabase for auth, database, KV storage, and Edge Functions, with OpenAI’s gpt-4o-mini handling per-entry AI analysis.
The Challenge
Private medium, social product
Dreams are personal by nature. Designing social features that feel inviting rather than exposing required careful thinking about visibility, consent, and what “sharing” actually means in this context.
AI that assists, not decides
AI-generated tags and sentiment scores needed to feel like a starting point, not a verdict. The interaction model had to keep the user in control of how their dreams are labelled and interpreted.
Making data feel personal
Analytics on dream content risk feeling clinical. The graph and chart views had to surface genuine insight without reducing something subjective and strange to a bar chart.
Solo full-stack
One person across product design, frontend, backend, database schema, Edge Functions, and AI integration. Every architectural decision had to be weighed against the cost of maintaining it alone.
Approach
Product Design & UX
The entry flow was the first design priority — if logging a dream is friction-heavy, the product fails immediately. The quick-entry form on the dashboard accepts a dream narrative, comma-separated tags, and a people field. A single “Suggest Tags” button sends the text to the AI backend and returns contextual tag suggestions in a couple of seconds; the user picks what fits and discards the rest.
Star rating and date default sensibly so the common case — logging a dream from the night before — requires minimal input. The goal was to make the entry feel closer to opening a notes app than filling out a form.
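As a sketch of how that low-friction entry path might hang together, the defaults and comma-separated parsing could live in a small draft constructor. All names and default values here are illustrative assumptions, not the actual Dreamer Graph implementation:

```typescript
// Hypothetical quick-entry draft. Field names, the neutral default
// rating, and "today" as the default date are assumptions for
// illustration only.
interface DreamEntryDraft {
  narrative: string;
  tags: string[];
  people: string[];
  rating: number; // star rating; assumed to default to a neutral 3
  date: string;   // ISO date; defaults to today (dream from last night)
}

// Split a comma-separated field into trimmed, de-duplicated values.
function parseCommaList(raw: string): string[] {
  return [...new Set(raw.split(",").map((s) => s.trim()).filter(Boolean))];
}

function newDraft(
  narrative: string,
  rawTags = "",
  rawPeople = ""
): DreamEntryDraft {
  return {
    narrative,
    tags: parseCommaList(rawTags),
    people: parseCommaList(rawPeople),
    rating: 3,
    date: new Date().toISOString().slice(0, 10),
  };
}
```

The point of the shape: everything except the narrative is optional or pre-filled, so the common case really is "type and save".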
Views were designed around two distinct modes: capture (Dashboard, Calendar) and reflection (Journal, Codex, Graph). Navigation between them is persistent and low-cost — moving from a new entry to the analytics view and back is a single click.
AI Integration
Two AI features run per entry: tag suggestion and sentiment analysis. Both are handled server-side via Supabase Edge Functions — the OpenAI API key never touches the client. gpt-4o-mini was chosen for cost efficiency: running analysis on every save only works if the model is cheap enough that per-entry calls don't break the product economics.
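A rough sketch of the server-side pattern, assuming a JSON-returning prompt. The prompt wording, field names, and temperature are assumptions; only the model choice and the keep-the-key-server-side rule come from the text:

```typescript
// Illustrative request builder for the per-entry analysis call.
// The real Edge Function almost certainly differs in detail; the key
// point is that the request is assembled and sent server-side, so
// OPENAI_API_KEY never ships to the browser.
interface AnalysisRequest {
  model: string;
  messages: { role: "system" | "user"; content: string }[];
  temperature: number;
}

function buildAnalysisRequest(dreamText: string): AnalysisRequest {
  return {
    model: "gpt-4o-mini", // cheap enough to run on every saved entry
    temperature: 0.3,     // assumed: low variance for consistent labels
    messages: [
      {
        role: "system",
        content:
          "Return JSON with `tags` (up to 5 short tags), `sentiment` " +
          "(1-10) and `emotion` (one word) for the dream below.",
      },
      { role: "user", content: dreamText },
    ],
  };
}

// Inside the Edge Function handler (Deno), roughly:
// const res = await fetch("https://api.openai.com/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${Deno.env.get("OPENAI_API_KEY")}`,
//     "Content-Type": "application/json",
//   },
//   body: JSON.stringify(buildAnalysisRequest(dreamText)),
// });
```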
Sentiment analysis returns a 1–10 score and an emotion label (joyful, anxious, peaceful, fearful, and several others). These appear as colour-coded badges on the journal entry — red for fear, yellow for joy — and feed directly into the graph view’s sentiment trend line.
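The badge mapping itself is a small lookup. The fear-red and joy-yellow pairings come from the text; the other colour assignments and the fallback are illustrative assumptions:

```typescript
// Hypothetical emotion-to-badge-colour mapping. "fearful" → red and
// "joyful" → yellow follow the description above; the rest is assumed.
function badgeColour(emotion: string): string {
  switch (emotion) {
    case "fearful":  return "red";
    case "joyful":   return "yellow";
    case "anxious":  return "orange"; // assumption
    case "peaceful": return "green";  // assumption
    default:         return "gray";   // unknown labels fall back to neutral
  }
}
```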
The interaction design deliberately keeps AI output as suggestion, not conclusion. Suggested tags are presented as options to accept or ignore. Sentiment labels are visible but secondary to the user’s own star rating. The AI enriches the entry; the user decides what it means.
The Graph
The Graph view is where the data becomes meaningful. A streak counter and activity heatmap establish the habit layer — current and longest recording streaks, similar to GitHub’s contribution graph. Below that, three configurable charts plot sentiment and personal rating over time with toggleable date ranges and chart types (line, bar, area).
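Streak logic is simple to state but easy to get subtly wrong. A minimal sketch, assuming entries carry ISO dates and that a "current" streak must end today or yesterday (function and field names are illustrative):

```typescript
// Compute current and longest recording streaks from entry dates
// (ISO "YYYY-MM-DD" strings). Names and the today-or-yesterday rule
// for "current" are assumptions, not the actual implementation.
function streaks(dates: string[]): { current: number; longest: number } {
  const DAY = 86_400_000; // ms per day; ISO dates parse as UTC midnight
  const days = [...new Set(dates)].sort(); // unique days, ascending
  let longest = 0;
  let run = 0;
  let prev: number | null = null;
  for (const day of days) {
    const t = Date.parse(day);
    run = prev !== null && t - prev === DAY ? run + 1 : 1;
    longest = Math.max(longest, run);
    prev = t;
  }
  // The run only counts as "current" if the latest entry is recent.
  const last = days[days.length - 1];
  const gap = last ? Date.now() - Date.parse(last) : Infinity;
  const current = gap < 2 * DAY ? run : 0;
  return { current, longest };
}
```

Working on UTC-midnight timestamps sidesteps DST edge cases that break naive "subtract 24 hours" comparisons on local dates.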
A “Common Themes” view renders a horizontal bar chart of the user’s most frequent tags across all entries. “Dream People” shows which individuals appear most often — a view that tends to surface patterns the user hasn’t consciously noticed.
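The Common Themes aggregation is a frequency count over all entries. A minimal sketch, with assumed names and entry shape:

```typescript
// Count tag frequency across entries and return the top N as
// [tag, count] pairs, most frequent first. Illustrative only.
function topTags(
  entries: { tags: string[] }[],
  n = 10
): [string, number][] {
  const counts = new Map<string, number>();
  for (const entry of entries) {
    for (const tag of entry.tags) {
      counts.set(tag, (counts.get(tag) ?? 0) + 1);
    }
  }
  return [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, n);
}
```

The same shape works for the "Dream People" view by swapping the `tags` field for a `people` field.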
The graph view on the Dashboard adds light customization — users can configure what they want to see at a glance. The intent is to make the data feel like a personal record rather than a generic analytics dashboard.
Symbol Codex & Social Layer
The Codex holds 300+ dream symbols across categories — Emotions, Animals, Places, Objects, Abstract — with alphabetic filtering, category toggles, and a search bar that queries both symbol names and internal tags. Each symbol card surfaces related symbols that link through on click, encouraging exploratory browsing over direct lookup.
Symbol matching is automatic: when a journal entry is expanded, the app runs the entry’s tags against the codex using a three-tier match — exact name match, partial string match, tag-based match — and surfaces up to three relevant symbols inline. This connects the personal record to the broader symbol library without requiring any extra action from the user.
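The three tiers described above can be sketched as an ordered list of predicates, tried in priority order until three symbols are found. The tier ordering and the cap of three come from the text; the `CodexSymbol` shape and matching details are assumptions:

```typescript
// Sketch of the three-tier symbol match: exact name, partial string,
// then tag overlap, capped at three results. Field names are assumed.
interface CodexSymbol {
  name: string;
  tags: string[];
}

function matchSymbols(
  entryTags: string[],
  codex: CodexSymbol[]
): CodexSymbol[] {
  const tags = entryTags.map((t) => t.toLowerCase());
  const tiers: ((s: CodexSymbol) => boolean)[] = [
    // Tier 1: an entry tag exactly matches the symbol name.
    (s) => tags.includes(s.name.toLowerCase()),
    // Tier 2: partial string match in either direction.
    (s) =>
      tags.some((t) => {
        const name = s.name.toLowerCase();
        return name.includes(t) || t.includes(name);
      }),
    // Tier 3: an entry tag matches one of the symbol's own tags.
    (s) => s.tags.some((st) => tags.includes(st.toLowerCase())),
  ];
  const out: CodexSymbol[] = [];
  const seen = new Set<CodexSymbol>();
  for (const tier of tiers) {
    for (const symbol of codex) {
      if (out.length === 3) return out;
      if (!seen.has(symbol) && tier(symbol)) {
        seen.add(symbol);
        out.push(symbol);
      }
    }
  }
  return out;
}
```

Running the tiers in order means an exact match always outranks a fuzzy one, so the three inline slots go to the most confident matches first.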
The social feed shows friends’ public and network-visible dreams with matched codex symbols displayed as pills. Reactions are lightweight and specific to the medium: 👁️ Seen this, ✨ Vivid, 😰 Unsettling, 💭 Relatable. The intent was to make the social layer feel native to the content rather than borrowed from a general-purpose feed.
Outcome
The closed beta is running with a small group of family and friends. Early feedback has focused on the entry flow (working well), the codex browsing experience (positive), and the social reactions (landing as intended — low-pressure and specific to the content).
Remaining pre-launch work includes dream visibility controls (the data model is complete, the UI is not yet built), friend tagging within entries, and the forgot-password flow. The codex is also slated for expansion before launch.
Target: public beta, summer 2026.