This is a Node.js application that implements an AI agent based on a basic version of the Ralph Wiggum loop (as in the Claude Code plugin implementation as of Jan 20th, 2026). It uses Ollama to generate code segments or commands that build, modify, test, and run code for a given project task. The agent operates in a loop: it generates a segment, executes it as Node.js code (using child_process for shell commands or other languages), handles any errors, and checks whether the project is complete.
Key features:
- Stateful progress tracking with logs.
- Customizable Ollama model and host.
- Adjustable context length (default: 42000).
- Strict output formatting from the LLM to avoid extraneous text.
- An interactive REPL.
- MCP (Model Context Protocol) support.
- Advanced summary technology.
Setup:
- Ensure Node.js is installed (v14+ recommended).
- Install dependencies:
  npm install ollama
- Set up Ollama:
  - Run Ollama on your specified host (e.g., http://localhost:11434).
  - Pull the desired model, e.g.:
    ollama pull ministral-3:8b
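The ollama package is what the agent uses to talk to the model. Below is a minimal sketch of such a call, assuming the defaults from this README; the exact prompt and function names in index.js may differ, and generateSegment is a hypothetical helper.

```js
const { Ollama } = require('ollama');

const ollama = new Ollama({ host: 'http://localhost:11434' });

// Hypothetical helper: ask the model for the next raw code segment.
async function generateSegment(prompt) {
  const res = await ollama.chat({
    model: 'ministral-3:8b',
    messages: [
      // Strict system prompt so the reply contains raw Node.js code only.
      { role: 'system', content: 'Reply with raw Node.js code only. No markdown, no commentary.' },
      { role: 'user', content: prompt },
    ],
    options: { num_ctx: 42000 }, // the adjustable context length
  });
  return res.message.content;
}
```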
Run the script with your project task as a CLI argument or a Markdown file path.
Basic command:
node index.js "Your project task here"--model <model-name>_: Specify the Ollama model (default: ministral-3:8b).--host <url>_: Specify the Ollama host (default: http://localhost:11434).--context-length <number>_: Set the context length for Ollama (default: 42000).
- Direct instructions: Pass as string arguments.
- Markdown spec file: Provide the file path; it will be read as the main task.
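A minimal sketch of how the option and task parsing described above might look; the defaults match this README, but parseArgs itself is hypothetical, not the actual code in index.js:

```js
const fs = require('fs');

function parseArgs(argv) {
  const opts = {
    model: 'ministral-3:8b',
    host: 'http://localhost:11434',
    contextLength: 42000,
  };
  const positional = [];
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (arg === '--model') opts.model = argv[++i];
    else if (arg === '--host') opts.host = argv[++i];
    else if (arg === '--context-length') opts.contextLength = Number(argv[++i]);
    else if (arg === '--mcp') opts.mcp = argv[++i];
    else positional.push(arg);
  }
  // A Markdown path is read as the main task; anything else is the task itself.
  const task = positional.join(' ');
  opts.task = task.endsWith('.md') ? fs.readFileSync(task, 'utf8') : task;
  return opts;
}

const opts = parseArgs(process.argv.slice(2));
```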
The app follows this process:
- Parse options and the initial prompt.
- Enter a loop:
  - Generate the next code segment using Ollama (the system prompt ensures raw Node.js code output).
  - Execute it as Node.js (via a temp file).
  - On error, append to the error log and retry.
  - On success, append to the progress log and check completion with Ollama.
- Exit when the model emits "PROJECT_DONE" or the completion check returns "yes".
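Put together, the loop might look roughly like the sketch below. This is a simplified illustration rather than the actual index.js: generateSegment is the hypothetical Ollama helper from the setup section, and the prompts and log handling are placeholders.

```js
const fs = require('fs');
const os = require('os');
const path = require('path');
const { execFileSync } = require('child_process');

async function agentLoop(task) {
  let progressLog = '';
  let errorLog = '';
  while (true) {
    // 1. Ask the model for the next raw Node.js segment.
    const code = await generateSegment(
      `Task: ${task}\nProgress so far:\n${progressLog}\nRecent errors:\n${errorLog}`
    );
    if (code.includes('PROJECT_DONE')) break;

    // 2. Write the segment to a temp file and execute it with node.
    const file = path.join(os.tmpdir(), `segment-${Date.now()}.js`);
    fs.writeFileSync(file, code);
    try {
      const output = execFileSync('node', [file], { encoding: 'utf8' });
      progressLog += `\n${code}\n-> ${output}`;

      // 3. Ask the model whether the task is complete.
      const verdict = await generateSegment(
        `Given this progress, is the task complete? Answer yes or no.\n${progressLog}`
      );
      if (/^yes/i.test(verdict.trim())) break;
    } catch (err) {
      // 4. On failure, log the error and let the next iteration self-correct.
      errorLog += `\n${err.message}`;
      console.error(err.message);
    }
  }
}
```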
For non-Node.js tasks (e.g., Python), the generated code uses Node's child_process to invoke the appropriate interpreter.
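For example, a generated segment answering the Fibonacci example below might shell out to Python like this (illustrative only, not actual agent output):

```js
const { execSync } = require('child_process');

// Compute the 10th Fibonacci number via a Python one-liner (closed form).
const out = execSync(
  'python3 -c "print(round(((1 + 5**0.5) / 2) ** 10 / 5**0.5))"',
  { encoding: 'utf8' }
);
console.log(out.trim()); // 55
```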
Examples:

node index.js --model ministral-3:8b --host http://192.168.50.135:11434 --context-length 50000 "Echo out 'Hello, World!'."

node index.js --model ministral-3:8b --host http://192.168.50.135:11434 --context-length 50000 "Calculate the 10th Fibonacci number using Python." --mcp '@modelcontextprotocol/server-everything'

node index.js --model ministral-3:8b --host http://192.168.50.135:11434 --context-length 50000 --mcp '@modelcontextprotocol/server-everything'

Troubleshooting:
- If Ollama returns formatted output, adjust the system prompt or model.
- For long projects, increase context length to avoid truncation.
- Errors are logged to console; the agent auto-corrects via loops.