# Edna Krabappel

An opinionated PRD planning assistant for creating structured Product Requirements Documents that work seamlessly with Ralph, the autonomous coding agent. Learn more and follow my journey with real-world project examples on Medium: https://medium.com/@cmcintosh_3425
Edna Krabappel is a conversational web application that helps you plan software features by:
- Guiding you through feature planning with clarifying questions
- Building structured user stories with proper acceptance criteria
- Estimating story points and pushing back when stories are too large (>6 points)
- Ensuring acceptance criteria are verifiable and specific
- Exporting PRDs in Ralph's `prd.json` format
## Features

- Chat Interface: Conversational planning with an AI assistant
- Story Point Estimation: Automatic estimation with warnings for oversized stories
- Multi-Project Management: Manage multiple projects and features
- Ralph Integration: Export PRDs directly in Ralph's format
- LLM Flexibility: Switch between Claude API and Ollama/local models
## Prerequisites

- Python 3.11+
- Node.js 18+
- Claude API key (optional) or Ollama running locally
## Backend Setup

```bash
cd backend

# Install dependencies
pip install -r requirements.txt
```

Configure LLM settings in `backend/settings.json`:

```json
{
  "llm": {
    "provider": "claude",
    "model": "claude-3-5-sonnet-20241022",
    "apiKey": "your-claude-api-key",
    "ollamaUrl": "http://localhost:11434"
  }
}
```

Set `"provider"` to `"claude"` or `"ollama"`, and `"model"` to the matching Claude or Ollama model name. `"apiKey"` is used only with Claude, and `"ollamaUrl"` only with Ollama.

Start the backend:

```bash
python main.py
```

The backend will run on http://localhost:8000.
## Frontend Setup

```bash
cd frontend

# Install dependencies
npm install

# Start the development server
npm run dev
```

The frontend will run on http://localhost:5173.
## Usage

### Create a PRD

- Click "+ New PRD" in the sidebar
- Enter:
  - Project Name: e.g., "MyApp"
  - Feature Name: e.g., "task-priority" (kebab-case)
  - Description (optional): Brief description of what you want to build
### Chat with Edna

Edna will:
- Ask clarifying questions with A/B/C/D options
- Propose user stories one at a time
- Estimate story points for each story
- Push back if a story is too large (>6 points)
- Suggest how to split oversized stories
Example conversation:
You: I want to add a task priority system
Edna: Got it! A task priority system would let users mark and filter tasks
by importance. Before we start building the PRD, I have a few clarifying questions:
1. What priority levels do you need?
A. Just High/Low (2 levels)
B. High/Medium/Low (3 levels)
C. 5-level priority scale
D. Custom/user-defined levels
[continues with more questions...]
You: 1B, 2C, 3C
Edna: Perfect! Let me break this down into manageable stories. Here's the first one:
### US-001: Add priority field to database
**Description:** As a developer, I need to store task priority so it persists across sessions.
**Acceptance Criteria:**
- Add priority column to tasks table: 'high' | 'medium' | 'low' (default 'medium')
- Generate and run migration successfully
- Typecheck passes
**Story Points:** 2
**Reasoning:** Simple database change with migration, low complexity
Does this look good?
### Preview the PRD

Switch to the "PRD Preview" tab to see:
- All user stories structured and formatted
- Story points for each story (color-coded by size)
- Total story count and points
- Acceptance criteria checklists
### Export for Ralph

When your PRD is ready:

- **Copy to Clipboard**: Click "Copy JSON" to copy the PRD to your clipboard
- **Download**: Click "Export prd.json" to download the file
The exported JSON is in Ralph's exact format and can be dropped directly into your Ralph project directory.
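Ralph defines the exact schema of the exported file. Purely as an illustration of the shape (the field names below are assumptions for this sketch, not Ralph's confirmed schema), an export for the story from the conversation above might look something like:

```json
{
  "project": "MyApp",
  "feature": "task-priority",
  "stories": [
    {
      "id": "US-001",
      "title": "Add priority field to database",
      "description": "As a developer, I need to store task priority so it persists across sessions.",
      "acceptanceCriteria": [
        "Add priority column to tasks table: 'high' | 'medium' | 'low' (default 'medium')",
        "Generate and run migration successfully",
        "Typecheck passes"
      ],
      "points": 2
    }
  ]
}
```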
## Story Point Guidelines

Edna uses these guidelines when estimating:
- 1-2 points: Simple change (add a field, update text, basic UI tweak)
- 3-4 points: Moderate change (new component, API endpoint, migration)
- 5-6 points: Complex change (multiple files, integration logic, state management)
- 7+ points: TOO BIG - Edna will push back and suggest splitting
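The sizing bands above can be sketched as a small helper. This is illustrative only — Edna's actual estimation is done by the LLM, not by a lookup function:

```python
def size_label(points: int) -> str:
    """Map a story-point estimate to the sizing bands Edna uses."""
    if points <= 0:
        raise ValueError("story points must be positive")
    if points <= 2:
        return "simple"
    if points <= 4:
        return "moderate"
    if points <= 6:
        return "complex"
    # Edna pushes back on anything above 6 points.
    return "too big: split this story"

print(size_label(2))  # simple
print(size_label(7))  # too big: split this story
```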
## LLM Configuration

### Using Claude

- Get an API key from console.anthropic.com
- Edit `backend/settings.json`:

```json
{
  "llm": {
    "provider": "claude",
    "model": "claude-3-5-sonnet-20241022",
    "apiKey": "sk-ant-..."
  }
}
```

### Using Ollama

- Install and run Ollama
- Pull a model: `ollama pull llama2` (or another model)
- Edit `backend/settings.json`:

```json
{
  "llm": {
    "provider": "ollama",
    "model": "llama2",
    "ollamaUrl": "http://localhost:11434"
  }
}
```

## Data Storage

PRDs are stored as JSON files in `data/`:
```
data/
├── myapp-task-priority.json
├── myapp-user-auth.json
└── otherapp-dashboard.json
```
Each file contains:
- The PRD structure (matching Ralph's format)
- Conversation history (for resuming chats)
- Story point estimates
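Because each PRD is a plain JSON file, the `data/` directory is easy to inspect with a few lines of Python. A minimal sketch — the field names (`"stories"`, `"points"`) are assumptions about the stored shape, not a documented schema:

```python
import json
from pathlib import Path


def list_prds(data_dir: str = "data") -> list[dict]:
    """Summarize every stored PRD file: story count and total points."""
    summaries = []
    for path in sorted(Path(data_dir).glob("*.json")):
        prd = json.loads(path.read_text())
        stories = prd.get("stories", [])
        summaries.append({
            "file": path.name,
            "stories": len(stories),
            "total_points": sum(s.get("points", 0) for s in stories),
        })
    return summaries
```

Run it from the project root to get a quick overview of every planned feature without opening the app.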
## Workflow with Ralph

- Plan your feature with Edna
- Export `prd.json`
- Copy it to your Ralph project directory
- Run `./ralph.sh` to have Ralph implement the stories autonomously
## Project Structure

```
Edna_Krabappel/
├── backend/              # FastAPI backend
│   ├── main.py           # API endpoints
│   ├── models.py         # Pydantic models
│   ├── storage.py        # JSON file storage
│   ├── llm.py            # LLM provider abstraction
│   ├── prompts.py        # System prompt loader
│   └── settings.json     # LLM configuration
├── frontend/             # React frontend
│   └── src/
│       ├── components/   # React components
│       ├── App.tsx       # Main app
│       └── api.ts        # API client
├── data/                 # PRD storage
└── prompt.md             # Edna's system prompt
```
## Troubleshooting

**Backend won't start**

- Check your Python version: `python --version` (needs 3.11+)
- Install dependencies: `pip install -r requirements.txt`
- Check for port conflicts on port 8000

**Frontend can't reach the backend**

- Ensure the backend is running on port 8000
- Check the browser console for CORS errors
- Verify `API_BASE` in `frontend/src/api.ts`

**LLM errors**

- Claude: Verify the API key in `settings.json` and ensure `apiKey` is set
- Ollama: Ensure Ollama is running (`ollama serve`) and accessible at the configured URL
- Check that the model name matches an available model
- Restart the backend after updating `settings.json`
## Roadmap

Future enhancements:
- Settings UI for configuring LLM without editing JSON
- Direct Ralph integration (start Ralph from Edna)
- Real-time Ralph execution log viewer
- PRD templates for common feature types
- Multi-user support with authentication
## License

Same as the Ralph Wiggums project.