95 changes: 64 additions & 31 deletions README.md
# Jean-Pierre, An LLM-powered Programming Assistant

> A command-line toolkit to support you in your daily work as a software
> programmer. Built to integrate into your existing workflows, providing a
> secure, powerful and flexible pair-programming experience with LLMs.

Visit [**jp.computer**] to learn more.

> [!NOTE]
> This project is in active development. Expect breaking changes. What is
> documented here is subject to change and may not be up to date. Please consult
> the [installation instructions](#getting-started) to get started, and [reach
> out to us](https://jp.computer/contact) if you need assistance or have
> feedback.

## Philosophy

JP is built to be **[provider-agnostic][1]**: your workflow shouldn't be coupled
to any single LLM backend. It is **[private and secure by default][2]**, with no
implicit network access or silent tool execution, and a **[proper Unix
citizen][3]**: a single static binary that composes with pipes, respects your
shell, and stays out of your way. It is **[extensible][4]** through sandboxed
plugins, **[configurable][5]** where it matters, and **[open-source and
independent][6]**: funded without VC money and with no allegiance to any LLM
provider, just software that serves its users.

[1]: docs/README/providers.md
[2]: docs/README/privacy-and-security.md
[3]: docs/README/workflow-integration.md
[4]: docs/README/extensibility.md
[5]: docs/README/configuration.md
[6]: docs/README/open-source.md

## Getting Started

JP is in active development. Install from source:

```sh
cargo install --locked --git https://github.com/dcdpr/jp.git
```

Initialize a new workspace in an existing directory:

```sh
jp init .
> Confirm before running tools?
Yes (safest option)
> Which LLM model do you want to use?
ollama/qwen3

Initialized workspace at current directory
```

Run your first query:

```sh
jp query "Is this thing on?"
Hello there! I am Jean-Pierre, how can I help you today?
```

Configure your JP workspace:

```sh
open .jp/config.toml
```
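The configuration file is plain TOML. Its exact schema is defined by JP and may
change while the project is in active development; purely as an illustration,
a minimal workspace config might look something like this (every key name below
is hypothetical, echoing the choices made during `jp init` above):

```toml
# Hypothetical sketch, not JP's documented schema. The key names are
# illustrative, mirroring the `jp init` prompts shown above.
[assistant]
model = "ollama/qwen3"

[tools]
confirm_before_run = true
```

Because the configuration is layered, workspace settings like these can be
overridden per invocation, as shown in the examples throughout these docs.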

See what else you can do:

```sh
jp help
```

[**jp.computer**]: https://jp.computer
[install]: https://jp.computer/installation
29 changes: 29 additions & 0 deletions docs/README/privacy-and-security.md
# Secure, Local, Private

JP conversations can be scoped to a workspace (e.g. a VCS-backed directory) or
stored locally on your machine, and you can switch between the two at any time:

```sh
# Conversations are stored in the current workspace by default
jp query --new "Where does this error come from?"

# You can start a local conversation, stored outside the workspace directory
jp query --new-local "What is the purpose of this module?"

# Or switch between them at any time
jp conversation edit --local
```

- **Local-first models**: First-class support for Ollama and llama.cpp. Run
open-weight models (Llama, Mistral, Qwen, etc.) entirely offline.
- **Zero telemetry**: JP sends nothing to us. Your queries, conversations, and
configuration never leave your control.
- **Sandboxed extensibility**: WASM-based plugins run in a true sandbox with
capability-based security — no filesystem, network, or environment access
unless explicitly granted. No security theater.
- **Local conversations**: Store conversations outside your workspace (not
tracked in git) for private or temporary work.
- **Memory safety**: Written in Rust. No buffer overflows, no use-after-free,
no data races.

[back to README](../../README.md)
21 changes: 21 additions & 0 deletions docs/README/providers.md
# Provider-Agnostic

JP works with any LLM provider. Switch between cloud and local models with a
single flag:

```sh
# Long arguments and flags
jp query --model anthropic/claude-sonnet-4-6 "Explain this function"

# Short arguments and flags
jp q -m ollama/qwen3:8b "What is the purpose of this module?"

# Custom model aliases
jp q -m gpt "How do I paginate?"
```

You can switch models at any time, use different defaults in different
situations, add model aliases, and start using newly released models without
updating JP. No lock-in.
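Aliases like `gpt` above live in your configuration. The exact key layout is
JP's to define; as a hypothetical sketch only (both the table name and the
target model are assumptions for illustration):

```toml
# Hypothetical sketch: the [models.aliases] table is an assumed layout,
# not JP's documented schema.
[models.aliases]
gpt = "openai/gpt-4o"
```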

[back to README](../../README.md)
94 changes: 94 additions & 0 deletions docs/README/workflow-integration.md
# Your Workflow, Your Way

JP integrates seamlessly with your existing workflows. It is a single binary
that you run from anywhere, either interactively or headlessly, and it respects
basic Unix conventions such as pipes, stdin, stdout/stderr, and exit codes:

```sh
# Interactive usage, can prompt for input
jp query "..."

# Force non-interactive mode
jp --no-prompt ...

# Non-interactive mode is selected automatically when stdout is redirected
jp query "..." > out.txt

# Pipe data into and out of jp
echo "Hello" | jp query | less

# Even with piped input, you can still edit the query
cat in.txt | jp query --edit

# You can detach a query from the terminal. Output is still sent to stdout,
# unless you redirect it.
jp --detach query "..."
```

You can choose which output format you want, from rendered Markdown to compact
JSONL:

```sh
jp --format
a value is required for '--format <FORMAT>'
[possible values: auto, text, text-pretty, json, json-pretty]
```

You can enable and increase log verbosity:

```sh
# Enable logging
jp --verbose

# Increase verbosity
jp --verbose --verbose

# Maximum verbosity
jp -vvvvv
```

Logging always goes to stderr, so you can redirect it to a file or pipe it to
another process:

```sh
jp -v 2> log.txt
```

JP is a single binary you call from wherever you already work: shell scripts,
editor terminals, `git` hooks, CI pipelines, Makefiles. It stores state in a
`.jp/` directory alongside your code, so conversations, configuration, and tools
travel with the project and can be committed to git.
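For instance, a `git` hook can call JP headlessly. A sketch (the hook itself,
the upstream branch name, and the prompt-free piped form are illustrative;
`--no-prompt` and piping into `jp query` are documented above):

```sh
#!/bin/sh
# .git/hooks/pre-push (sketch): have JP look over the outgoing diff.
# --no-prompt forces non-interactive mode, as described above; the
# comparison against origin/main is an assumption about your branch setup.
git diff origin/main...HEAD | jp --no-prompt query > jp-review.txt
```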

Each conversation has a unique ID and contains the full message history:

```sh
jp init .
jp query --new "What is the purpose of this module?"
jp query "How would you refactor the error handling?"
jp query "Write the implementation."

# One-off query, nothing persisted
jp -! query --new "..."

# Temporary conversation, persisted until you start a new one
jp query --new --tmp "..."

# Or persist for a given duration
jp query --new --tmp=1d "..."

# If you change your mind, mark the conversation as non-temporary
jp conversation edit --no-tmp
```

Conversations are text files. Commit them alongside your code changes:

```sh
git add .jp/conversations/2024-01-15-143022/
git commit -m "feat: add user authentication"
```

Your teammates clone the repo and get the full context of how you arrived at the
implementation. Switch conversations, fork from any point in history, or grep
across all of them with standard Unix tools.

[back to README](../../README.md)