151 changes: 151 additions & 0 deletions docs/concepts/llm.md
# Large Language Model (LLM) Configuration

OpenAF provides first-class support for Large Language Model (LLM) prompts via the `$llm` function in JavaScript jobs and the `oafp` CLI. You can configure which model to use by setting environment variables at runtime.

## Environment Variables

### OAF_MODEL

Used by the `$llm` function and the OpenAF JavaScript APIs to select and configure a model. If no explicit model is passed to `$llm()`, OpenAF parses `OAF_MODEL` as a JSON or SLON string.

- Type: string (JSON/SLON) or simple `<type>` name
- Default key: `OAF_MODEL`

Examples:

```bash
# Full JSON configuration (SLON is also supported)
export OAF_MODEL='{
  "type": "openai",
  "key": "YOUR_OPENAI_API_KEY",
  "model": "gpt-4",
  "temperature": 0.7,
  "timeout": 900000
}'

# Inline SLON shorthand (must parse as a map); double quotes let $OPENAI_KEY expand
export OAF_MODEL="(type: openai, key: $OPENAI_KEY, model: gpt-4, temperature: 0.7, timeout: 900000)"
```

With `OAF_MODEL` set, you can use `$llm()` without arguments:

```javascript
// Inside an oJob or script
var response = $llm().prompt("Summarize the following text...")
cprint(response)
```

You can still override at call time:

```javascript
var response = $llm({"type": "ollama", "model": "mistral", "url": "https://models.local", "timeout": 900000})
.prompt("Analyze this data...")
```

### OAFP_MODEL

Used by the `oafp` command-line tool to drive prompts from your shell or pipelines. Set `OAFP_MODEL` in the same format as `OAF_MODEL`.

Examples:

```bash
# Use OpenAI via CLI
export OAFP_MODEL="(type: openai, key: $OPENAI_KEY, model: gpt-4, temperature: 0.7, timeout: 900000)"

# Use Gemini via CLI
export OAFP_MODEL='(type: gemini, model: gemini-2.5-flash-preview-05-20, key: YOUR_GEMINI_KEY, timeout: 900000, temperature: 0, params: (tools: [(googleSearch: ())]))'
```

Then run:

```bash
# Single prompt
oafp in=llm data="Translate this to French: Hello, world"

# Prompt from stdin
TOPIC=love && printf "Write a poem about ${TOPIC}" | oafp in=llm
```

## Tips

- **Environment isolation**: Set these variables in your CI/CD or container environment to avoid hard-coding secrets.
- **SLON support**: OpenAF will parse SLON (Single Line Object Notation) if you prefer a more compact syntax.
- **Multiple providers**: You can switch providers by changing `type` (e.g. `openai`, `gemini`, `ollama`, `anthropic`).
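As a sketch of the provider-switching tip, a small shell helper can swap `OAF_MODEL` between providers. The helper name, model choices, and the local Ollama URL below are illustrative assumptions, not part of OpenAF:

```shell
# Hypothetical helper: point OAF_MODEL at a different provider by swapping its SLON value
use_llm() {
  case "$1" in
    openai)
      # Assumes OPENAI_KEY is already exported in the environment
      export OAF_MODEL="(type: openai, key: $OPENAI_KEY, model: gpt-4, timeout: 900000)"
      ;;
    ollama)
      # Assumed local Ollama endpoint and model name
      export OAF_MODEL='(type: ollama, model: mistral, url: http://localhost:11434, timeout: 900000)'
      ;;
    *)
      echo "unknown provider: $1" >&2
      return 1
      ;;
  esac
}

use_llm ollama
echo "$OAF_MODEL"
```

Because only `type` and the provider-specific fields change, the rest of your scripts can keep calling `$llm()` or `oafp in=llm` unchanged.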

## Reference Table: OAF_MODEL / OAFP_MODEL Fields

| Field | Type | Required | Description | Provider Notes |
|-------------- |----------|----------|-----------------------------------------------------------------------------|-------------------------------------------------|
| type | string | Yes | Provider type (e.g. `openai`, `gemini`, `ollama`, `anthropic`, etc.) | Must match supported provider |
| options | map | Yes | Provider-specific options (see below) | Structure varies by provider |
| tools | map | No | Tool definitions for function calling | OpenAI, Gemini, Anthropic |
| timeout | integer | No | Request timeout in milliseconds | All |
| noSystem | boolean | No | If true, suppress system messages in output | All |
| headers | map | No | Custom HTTP headers | All |
| params | map | No | Additional provider-specific parameters (e.g. `max_tokens`, `top_p`) | OpenAI, Gemini, Anthropic, Ollama |

**Provider-specific `options` fields:**

| Provider | Option | Type | Required | Description |
|------------|-------------|----------|----------|----------------------------------------------|
| openai | key | string | Yes | API key |
| openai | model | string | Yes | Model name (e.g. `gpt-3.5-turbo`) |
| openai | temperature | float | No | Sampling temperature |
| openai | url | string | No | API endpoint (default: OpenAI) |
| gemini | key | string | No | API key (if required) |
| gemini | model | string | Yes | Model name (e.g. `gemini-pro`) |
| ollama | model | string | Yes | Model name (e.g. `llama2`) |
| ollama | url | string | No | Ollama server URL |
| anthropic | key | string | Yes | API key |
| anthropic | model | string | Yes | Model name (e.g. `claude-3-opus-20240229`) |
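Putting the table together, a hypothetical Anthropic configuration might look like the following (the key is a placeholder; the model name comes from the table's example column):

```shell
# Hypothetical Anthropic configuration assembled from the fields above
# (placeholder key; replace with your real credentials)
export OAF_MODEL='(type: anthropic, key: "YOUR_ANTHROPIC_KEY", model: claude-3-opus-20240229, timeout: 900000)'
```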

> to be completed

## Advanced Usage Examples

### Using $llm with JSON Context

```javascript
// Pass a JSON context to the prompt
var context = {
user: "Alice",
data: [1, 2, 3, 4],
task: "Summarize the numbers above."
};
var response = $llm().withContext(context, "context data").prompt(
`Given the following context, provide a summary.`
)
print(response)
```

### Using $llm with Image Input

```javascript
// Prompt with an image (file path or base64)
var response = $llm().promptImage(
"Describe the contents of this image.",
"/path/to/image.png"
);
print(response);
```

### Using oafp CLI with JSON Context

```bash
# Pass JSON context as input
oafp data='{ user: "Bob", numbers: [5,6,7] }' llmcontext="numbers used by user" llmprompt="Summarize the numbers for the user."
```

### Using oafp CLI with Image Input

```bash
# Prompt with an image (path or base64)
oafp in=llm data='What is in this image?' llmimage="/path/to/image.jpg"
```

---

## Further Reading

- [oJob LLM Integration](../ojob.md#job-llm)
- [OpenWrap AI API Reference](../api/openaf.md#owai-gpt)
198 changes: 198 additions & 0 deletions docs/guides/ojob/build-mcp-server.md
---
layout: default
title: Build MCP servers with oJob
parent: oJob
grand_parent: Guides
---

# Build Model Context Protocol servers

OpenAF can expose [Model Context Protocol](https://modelcontextprotocol.io/) tools through regular oJobs. The `https://github.com/openaf/mini-a/mcps` directory in the distribution contains working examples (`mcp-db.yaml`, `mcp-ch.yaml`, …) and the shared runtime lives in `https://github.com/openaf/oJob-common/oJobMCP.yaml`. This guide distils the key pieces so you can author your own MCP server and test it locally or over HTTP.

## Understand the shared MCP runtime

`oJob-common/oJobMCP.yaml` publishes two shortcuts that do most of the heavy lifting:

- `httpdMCP` starts an HTTP JSON-RPC endpoint (defaults to `/mcp`) and wires requests to jobs you expose.
- `stdioMCP` speaks MCP over STDIO, allowing you to run the server as a child process.

Both variants expect:

- `description.serverInfo` with `name`, `title` and `version`.
- `fnsMeta`: MCP tool metadata (JSON Schema, annotations, etc.).
- `fns`: the mapping between MCP tool names and oJob job names.

Your YAML file only needs to prepare these maps and include the shortcut.

## Start from the MCP skeleton

Create a new file under `mini-a/mcps/`, e.g. `mcp-myservice.yaml`, following the structure below (adapted from `mini-a/mcps/CREATING.md`):

```yaml
# Author: You
help:
text : A STDIO/HTTP MCP server for MyService
expects:
- name : onport
desc : If defined starts an HTTP MCP server on the provided port
example : "8888"
mandatory: false
- name : myConfig
desc : Example extra parameter
example : "value"
mandatory: true

todo:
- Init MyService
- (if ): "isDef(args.onport)"
((then)):
- (httpdMCP): &MCPSERVER
description:
serverInfo:
name : mini-a-myservice
title : OpenAF MCP MyService
version: 1.0.0
((fnsMeta)): &MCPFNSMETA
myservice-tool:
name : myservice-tool
description: Performs the main operation
inputSchema:
type : object
properties:
param1:
type : string
description: Example parameter
required: [ param1 ]
annotations:
title : MyService Tool
readOnlyHint : true
idempotentHint: true
((fns )): &MCPFNS
myservice-tool: MyService Tool
((else)):
- (stdioMCP): *MCPSERVER
((fnsMeta)): *MCPFNSMETA
((fns )): *MCPFNS

ojob:
opacks:
- openaf : 20250915
- oJob-common: 20250914
daemon : true
argsFromEnvs: true
logToConsole: false

include:
- oJobMCP.yaml

jobs:
- name : Init MyService
check:
in:
myConfig: isString
exec : | #js
// Initialise SDK clients, cache credentials, etc.
global.myServiceConfig = args

- name : MyService Tool
check:
in:
param1: isString
exec : | #js
if (!isDef(global.myServiceConfig)) return "[ERROR] Service not initialised"
return {
content: [{
type: "text",
text: `Processed ${args.param1}`
}]
}

- name : Cleanup MyService
type : shutdown
exec : | #js
delete global.myServiceConfig
```

Key points:

- Keep the `help` section up to date so users know which arguments are required.
- The `todo` block first runs any initialisation job(s) and then chooses between HTTP or STDIO mode using the `onport` argument.
- `fnsMeta` entries must describe your tools with valid JSON Schema; `fns` maps the tool names to the job that should run.
- Always add a shutdown job for cleanup — it runs when the MCP exits.

## Tips for implementing tools

- Return errors as strings that start with `[ERROR] …`. The existing MCP tooling looks for that pattern.
- Use `check.in` validations so bad inputs fail fast.
- For read/write operations, consider a flag (e.g. `allowWrite`) to prevent accidental changes, similar to `mcp-ssh.yaml`.
- Inspect other MCPs (database, time, SSH, etc.) in `mini-a/mcps/` for real-world patterns.
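The `allowWrite` tip can be sketched as a small guard at the top of a tool job. The function below is a plain JavaScript illustration with hypothetical names (it is not an OpenAF API), returning the `[ERROR] …` string pattern the MCP tooling looks for:

```javascript
// Sketch: refuse mutating operations unless the server was started with allowWrite=true
function guardWrite(config, operation) {
  var mutating = operation !== "read"
  if (mutating && !(config && config.allowWrite)) {
    // Follow the convention of returning error strings that start with [ERROR]
    return "[ERROR] write operations are disabled (start the server with allowWrite=true)"
  }
  // null means the operation may proceed
  return null
}
```

Inside a tool job you would call such a guard first and return its result immediately when it is non-null, before touching the underlying service.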

## Smoke-test the server with oJob

Run in STDIO mode during development:

```bash
ojob mcps/mcp-myservice.yaml myConfig=value
```

Run as an HTTP server:

```bash
ojob mcps/mcp-myservice.yaml onport=12345 myConfig=value
```

Both rely on the shortcuts from `oJob-common/oJobMCP.yaml`. If you need extra HTTP middleware, add it before the `httpdMCP` call.

## Test with `$mcp` inside OpenAF

`openaf/js/openaf.js` ships the `$mcp` client helper. You can use it from the REPL or inside your scripts:

```javascript
var client = $mcp({
cmd : "ojob mcps/mcp-myservice.yaml myConfig=value",
debug : false,
strict: true
})

client.initialize()

log(client.listTools()) // -> { tools: [...] }

var result = client.callTool("myservice-tool", { param1: "demo" })
log(result)

client.destroy()
```

For HTTP mode, switch to `type: "remote"` and provide `url: "http://localhost:12345/mcp"`.

## Test with `oafp in=mcp`

`oafp` can act as a CLI MCP client (see `oafp/src/docs/USAGE.md`):

- List tools exposed by your STDIO server:

```bash
oafp in=mcp inmcptoolslist=true data="(cmd: 'ojob mcps/mcp-myservice.yaml myConfig=value')"
```

- Invoke a tool via STDIO:

```bash
oafp in=mcp data="(cmd: 'ojob mcps/mcp-myservice.yaml myConfig=value', tool: 'myservice-tool', params: (param1: 'demo'))"
```

- Invoke the same tool over HTTP:

```bash
oafp in=mcp data="(type: remote, url: 'http://localhost:12345/mcp', tool: 'myservice-tool', params: (param1: 'demo'))"
```

Set `inmcplistprompts=true` to inspect available prompts (if you expose any) or use the other connection fields (`__timeout__`, `__clientInfo__`, …) listed in the `oafp` usage document.

## Where to go next

- Browse the catalog in `https://github.com/openaf/mini-a/mcps/README.md` for inspiration.
- Revisit `https://github.com/openaf/mini-a/mcps/CREATING.md` whenever you need the full checklist.
- Extend `https://github.com/openaf/oJob-common/oJobMCP.yaml` if you need new shared behaviours (additional logging, authentication, etc.).

12 changes: 9 additions & 3 deletions docs/reference/db.md
Closes the corresponding prepared statement. If an error occurs during this process an exception will be thrown.
__DB.commit()__

````
Commits the current transaction associated with this DB object. An exception will be thrown if any database error occurs.

Example:

db.u("INSERT INTO A VALUES (1)");
db.commit();

````
### DB.convertDates

Stops a H2 server started for this DB instance.
__DB.q(aQuery) : Map__

````
Executes the SQL query provided in aQuery on the current DB connection. It returns a Map with a results array where each element corresponds to one row. In case of error an exception will be thrown.
````
### DB.qLob

__DB.qLob(aSQL) : Object__

````
Executes aSQL on the current DB object returning only the first row and column. If that column is a CLOB a string is returned; if it is a BLOB a byte array is provided. Errors will raise an exception.
````
### DB.qs
