Debo is not a journal with a chat box. It stores meaning, not just text — turning your life data into genuine self-knowledge.
Debo is a Life Intelligence System that learns from your writing, retrieves your history with citations, detects patterns across time, and turns memory into useful guidance.
- AI Memory Engine — Extracts facts, preferences, people, and goals from your entries into durable, long-term memory
- Ask Your Life — Answers natural questions like "When was I happiest this year?" with clickable citations
- Pattern Detection — Surfaces behavioral loops, recurring stressors, and emotional trends
- Memory Graph — Visualizes connections between people, topics, and emotions
- Voice Capture (Jarvis) — Real-time ambient voice interface with sub-second response
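The Memory Engine's extraction step can be sketched roughly as follows. The record shape and the keyword rule are illustrative stand-ins, not Debo's actual schema (Debo uses an LLM, not regexes, for extraction):

```typescript
// Hypothetical shape of a durable memory extracted from a journal entry.
type MemoryKind = "fact" | "preference" | "person" | "goal";

interface MemoryRecord {
  kind: MemoryKind;
  subject: string;        // e.g. "morning runs", "Alice", "marathon"
  statement: string;      // the durable claim pulled out of the entry
  sourceEntryId: string;  // citation back to the originating entry
  extractedAt: Date;
}

// Toy extractor: a keyword rule stands in for the LLM extraction pass.
function extractMemories(entryId: string, text: string): MemoryRecord[] {
  const records: MemoryRecord[] = [];
  const pref = text.match(/I (?:love|prefer|hate) ([\w ]+)/i);
  if (pref) {
    records.push({
      kind: "preference",
      subject: pref[1].trim(),
      statement: pref[0],
      sourceEntryId: entryId,
      extractedAt: new Date(),
    });
  }
  return records;
}

const memories = extractMemories("entry-42", "Long day. I love morning runs by the river.");
console.log(memories[0].kind, "-", memories[0].subject);
// → preference - morning runs by the river
```

The key property is that every record keeps a `sourceEntryId`, which is what makes the clickable citations possible later.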
- Framework: Next.js 16 (App Router), React 19
- Orchestration: Mastra (multi-agent system)
- AI: NVIDIA NIM / OpenAI / Anthropic
- Database: Neon (PostgreSQL), Qdrant (Vector DB)
- Media Storage: Google Drive (video/audio journals)
- Auth: Stack Auth
- Voice: LiveKit
- Deployment: Cloudflare Workers
📋 See Project Roadmap for feature status and upcoming work
# Install dependencies
bun install
# Copy environment variables
cp .env.example .env.local
# Set up database
bun run db:push
# Run development server
bun run dev

Then open http://localhost:3000
src/
├── app/ # Next.js App Router pages
├── actions/ # Server Actions (auth, chat, journals, memories)
├── components/ # UI components (dashboard, landing, chat)
├── db/ # Drizzle ORM schemas
├── lib/ # Core logic (AI, vector search, memory)
└── mastra/ # Mastra AI agents
Debo follows Editorial Calm — a design philosophy focused on:
- Distraction-free, magazine-like aesthetic
- Typography-first approach
- Minimal UI with generous whitespace
graph TD
    User[User] --> Web[Next.js 16 Web UI]
    User --> Voice[LiveKit Jarvis Voice Agent]

    subgraph Edge[Cloudflare Edge]
        Web --> Actions[Server Actions]
        Web --> ChatRoute[Chat Runtime]
    end

    subgraph Orchestration[Mastra]
        ChatRoute --> Agents[AI Agents]
        Agents --> Tools[Tools]
    end

    subgraph Storage[Data Layer]
        Tools --> DB[(Neon Postgres)]
        Tools --> Qdrant[(Qdrant Vector)]
    end

    Agents --> LLM[LLM Provider]
    LLM --> Stream[Streaming Response]
    Stream --> User
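Read top to bottom, the diagram amounts to a small request loop: gather grounded context via tools, then stream the LLM's answer back. A minimal sketch, assuming hypothetical tool and LLM interfaces (the real path goes through Mastra agents and Server Actions):

```typescript
// Illustrative chat path: question -> tool calls -> LLM -> streamed tokens.
type Tool = (query: string) => Promise<string[]>;

async function* chatRuntime(
  question: string,
  tools: Tool[],                                   // e.g. Qdrant search, Postgres lookup
  llm: (prompt: string) => AsyncIterable<string>,  // streaming LLM provider
): AsyncGenerator<string> {
  // Agents call tools to gather grounded context first...
  const context = (await Promise.all(tools.map((t) => t(question)))).flat();
  // ...then stream the answer token by token back to the user.
  const prompt = `Context:\n${context.join("\n")}\n\nQuestion: ${question}`;
  for await (const token of llm(prompt)) yield token;
}
```

Streaming matters here because the same loop backs both the web chat and the sub-second Jarvis voice agent.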
1. Journal Entry → Saved to Neon Postgres
2. Embedding → Entry is chunked, embedded, and stored in Qdrant for semantic search
3. Memory Extraction → Facts, preferences, and people stored in the memory table
4. Query → Context retrieved from both Qdrant (semantic) and Postgres (structured)
5. Answer → Grounded response with citations sent back to the user
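The query and answer steps above boil down to merging the two retrieval paths and carrying entry IDs through as citations. A sketch under assumed types (all names here are illustrative, not Debo's actual code):

```typescript
// Semantic hits come from Qdrant; structured rows come from Neon Postgres.
interface Chunk { entryId: string; text: string; score: number }
interface MemoryRow { entryId: string; statement: string }

function buildGroundedContext(
  semanticHits: Chunk[],
  memoryRows: MemoryRow[],
  topK = 3,
): { context: string; citations: string[] } {
  // Keep only the best-scoring chunks, then merge with structured memories.
  const top = [...semanticHits].sort((a, b) => b.score - a.score).slice(0, topK);
  const citations = [...new Set([
    ...top.map((c) => c.entryId),
    ...memoryRows.map((m) => m.entryId),
  ])];
  const context = [
    ...top.map((c) => `[${c.entryId}] ${c.text}`),
    ...memoryRows.map((m) => `[${m.entryId}] ${m.statement}`),
  ].join("\n");
  return { context, citations };
}

const { citations } = buildGroundedContext(
  [{ entryId: "e7", text: "Felt calm after the long hike", score: 0.91 }],
  [{ entryId: "e2", statement: "Goal: hike once a month" }],
);
console.log(citations); // → [ 'e7', 'e2' ]
```

Deduplicating `citations` while tagging each context line with its entry ID is what lets the final answer cite sources the user can click through.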
MIT © SH20RAJ
