
Edna Krabappel

An opinionated PRD planning assistant for creating structured Product Requirements Documents that plug directly into Ralph, the autonomous coding agent. Learn more and follow my journey, with real-world project examples, on Medium: https://medium.com/@cmcintosh_3425

What is Edna?

Edna Krabappel is a conversational web application that helps you plan software features by:

  • Guiding you through feature planning with clarifying questions
  • Building structured user stories with proper acceptance criteria
  • Estimating story points and pushing back when stories are too large (>6 points)
  • Ensuring acceptance criteria are verifiable and specific
  • Exporting PRDs in Ralph's prd.json format

Key Features

  • Chat Interface: Conversational planning with an AI assistant
  • Story Point Estimation: Automatic estimation with warnings for oversized stories
  • Multi-Project Management: Manage multiple projects and features
  • Ralph Integration: Export PRDs directly in Ralph's format
  • LLM Flexibility: Switch between Claude API and Ollama/local models

Setup

Prerequisites

  • Python 3.11+
  • Node.js 18+
  • Claude API key (optional) or Ollama running locally

Backend Setup

cd backend

# Install dependencies
pip install -r requirements.txt

# Configure LLM settings by editing backend/settings.json.
# Note: JSON does not allow comments. Set "provider" to "claude" or "ollama";
# "apiKey" applies only to Claude, "ollamaUrl" only to Ollama.
{
  "llm": {
    "provider": "claude",
    "model": "claude-3-5-sonnet-20241022",
    "apiKey": "your-claude-api-key",
    "ollamaUrl": "http://localhost:11434"
  }
}

# Start the backend
python main.py

The backend will run on http://localhost:8000.

Frontend Setup

cd frontend

# Install dependencies
npm install

# Start the development server
npm run dev

The frontend will run on http://localhost:5173.

Usage

1. Create a New PRD

  1. Click "+ New PRD" in the sidebar
  2. Enter:
    • Project Name: e.g., "MyApp"
    • Feature Name: e.g., "task-priority" (kebab-case)
    • Description (optional): Brief description of what you want to build

2. Chat with Edna

Edna will:

  • Ask clarifying questions with A/B/C/D options
  • Propose user stories one at a time
  • Estimate story points for each story
  • Push back if a story is too large (>6 points)
  • Suggest how to split oversized stories

Example conversation:

You: I want to add a task priority system

Edna: Got it! A task priority system would let users mark and filter tasks
by importance. Before we start building the PRD, I have a few clarifying questions:

1. What priority levels do you need?
   A. Just High/Low (2 levels)
   B. High/Medium/Low (3 levels)
   C. 5-level priority scale
   D. Custom/user-defined levels

[continues with more questions...]

You: 1B, 2C, 3C

Edna: Perfect! Let me break this down into manageable stories. Here's the first one:

### US-001: Add priority field to database
**Description:** As a developer, I need to store task priority so it persists across sessions.

**Acceptance Criteria:**
- Add priority column to tasks table: 'high' | 'medium' | 'low' (default 'medium')
- Generate and run migration successfully
- Typecheck passes

**Story Points:** 2
**Reasoning:** Simple database change with migration, low complexity

Does this look good?

3. Review PRD Preview

Switch to the "PRD Preview" tab to see:

  • All user stories structured and formatted
  • Story points for each story (color-coded by size)
  • Total story count and points
  • Acceptance criteria checklists

4. Export for Ralph

When your PRD is ready:

  1. Copy to Clipboard: Click "Copy JSON" to copy the PRD to your clipboard
  2. Download: Click "Export prd.json" to download the file

The exported JSON is in Ralph's exact format and can be dropped directly into your Ralph project directory.

Story Point Guidelines

Edna uses these guidelines when estimating:

  • 1-2 points: Simple change (add a field, update text, basic UI tweak)
  • 3-4 points: Moderate change (new component, API endpoint, migration)
  • 5-6 points: Complex change (multiple files, integration logic, state management)
  • 7+ points: TOO BIG - Edna will push back and suggest splitting

LLM Configuration

Using Claude API

  1. Get an API key from console.anthropic.com
  2. Edit backend/settings.json:
{
  "llm": {
    "provider": "claude",
    "model": "claude-3-5-sonnet-20241022",
    "apiKey": "sk-ant-..."
  }
}

Using Ollama (Local)

  1. Install and run Ollama
  2. Pull a model: ollama pull llama2 (or another model)
  3. Edit backend/settings.json:
{
  "llm": {
    "provider": "ollama",
    "model": "llama2",
    "ollamaUrl": "http://localhost:11434"
  }
}

Data Storage

PRDs are stored as JSON files in data/:

data/
├── myapp-task-priority.json
├── myapp-user-auth.json
└── otherapp-dashboard.json

Each file contains:

  • The PRD structure (matching Ralph's format)
  • Conversation history (for resuming chats)
  • Story point estimates
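As an illustrative sketch only (the real field names are defined by Edna and Ralph and are not documented here), a stored file such as data/myapp-task-priority.json might look like:

```json
{
  "project": "MyApp",
  "feature": "task-priority",
  "stories": [
    {
      "id": "US-001",
      "title": "Add priority field to database",
      "description": "As a developer, I need to store task priority so it persists across sessions.",
      "acceptanceCriteria": [
        "Add priority column to tasks table: 'high' | 'medium' | 'low' (default 'medium')",
        "Generate and run migration successfully",
        "Typecheck passes"
      ],
      "points": 2
    }
  ],
  "conversation": [
    { "role": "user", "content": "I want to add a task priority system" }
  ]
}
```

The story fields mirror the structure Edna proposes in chat (ID, description, acceptance criteria, points); treat the layout above as hypothetical until you inspect an exported file.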

Integration with Ralph

  1. Plan your feature with Edna
  2. Export prd.json
  3. Copy to your Ralph project directory
  4. Run ./ralph.sh to have Ralph implement the stories autonomously

Architecture

Edna_Krabappel/
├── backend/                 # FastAPI backend
│   ├── main.py             # API endpoints
│   ├── models.py           # Pydantic models
│   ├── storage.py          # JSON file storage
│   ├── llm.py              # LLM provider abstraction
│   ├── prompts.py          # System prompt loader
│   └── settings.json       # LLM configuration
├── frontend/               # React frontend
│   └── src/
│       ├── components/     # React components
│       ├── App.tsx         # Main app
│       └── api.ts          # API client
├── data/                   # PRD storage
└── prompt.md               # Edna's system prompt

Troubleshooting

Backend won't start

  • Check Python version: python --version (needs 3.11+)
  • Install dependencies: pip install -r requirements.txt
  • Check for port conflicts (port 8000)
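To rule out a port conflict before digging further, you can probe port 8000 with the Python standard library. This is a minimal sketch that only checks whether something is already listening; it assumes nothing about the backend itself:

```python
# Probe a local TCP port; connect_ex returns 0 when something is listening.
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    if port_in_use(8000):
        print("Port 8000 is already taken; stop the other process first.")
    else:
        print("Port 8000 is free; the backend should be able to bind it.")
```

If the port is taken, stop the conflicting process (e.g. find it with lsof -i :8000 on macOS/Linux) before restarting the backend.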

Frontend won't connect

  • Ensure backend is running on port 8000
  • Check browser console for CORS errors
  • Verify API_BASE in frontend/src/api.ts

LLM errors

  • Claude: Verify API key in settings.json
  • Ollama: Ensure Ollama is running (ollama serve)
  • Check model name matches available models

"LLM provider not configured"

  • Restart the backend after updating settings.json
  • For Claude: Ensure apiKey is set
  • For Ollama: Ensure Ollama is accessible at configured URL

Next Steps

Future enhancements:

  • Settings UI for configuring LLM without editing JSON
  • Direct Ralph integration (start Ralph from Edna)
  • Real-time Ralph execution log viewer
  • PRD templates for common feature types
  • Multi-user support with authentication

License

Same license as the Ralph Wiggums project.
