53 changes: 7 additions & 46 deletions README.md
@@ -1,12 +1,12 @@
[![Ask DeepWiki](https://deepwiki.com/badge.svg "DeepWiki Documentation")](https://deepwiki.com/getjavelin/javelin-python)

## Javelin: an Enterprise-Scale, Fast LLM Gateway/Edge
## Highflame: Agent Security Platform SDK

This is the Python client package for Javelin.
This is the Python SDK package for Highflame.

For more information about Javelin, see https://getjavelin.com
For more information about Highflame, see https://www.highflame.com

Javelin Documentation: https://docs.getjavelin.io
Highflame Documentation: https://docs.highflame.ai

### Development

@@ -17,7 +17,7 @@ For local development, Please change `version = "RELEASE_VERSION"` with any sema
### Installation

```bash
pip install javelin-sdk
pip install highflame
```

### Quick Start Guide
@@ -58,51 +58,12 @@ poetry install

```bash
# Uninstall any existing version
pip uninstall javelin-sdk -y
pip uninstall highflame -y

# Build the package
poetry build

# Install the newly built package
pip install dist/javelin_sdk-<version>-py3-none-any.whl
pip install dist/highflame-<version>-py3-none-any.whl
```
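After rebuilding and reinstalling, a quick way to confirm which version actually got picked up is to query installed-package metadata from Python. The distribution name `highflame` below follows the rename in this PR and is an assumption, not a published package:

```python
from importlib import metadata


def installed_version(dist_name: str):
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None


# Prints the installed version, or None if the wheel was not installed.
print(installed_version("highflame"))
```

Checking via `importlib.metadata` (stdlib since Python 3.8) avoids importing the package itself, so it works even if the install is broken.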

## [Universal Endpoints](https://docs.getjavelin.io/docs/javelin-core/integration#unified-endpoints)

Javelin provides universal endpoints that allow you to use a consistent interface across different LLM providers. Here are the main patterns:

#### Azure OpenAI
- [Basic Azure OpenAI integration](https://github.com/highflame-ai/javelin-python/blob/main/examples/azure-openai/azure-universal.py)
- [Universal endpoint implementation](https://github.com/highflame-ai/javelin-python/blob/main/examples/azure-openai/javelin_azureopenai_univ_endpoint.py)
- [OpenAI-compatible interface](https://github.com/highflame-ai/javelin-python/blob/main/examples/azure-openai/openai_compatible_univ_azure.py)

#### Bedrock
- [Basic Bedrock integration](https://github.com/highflame-ai/javelin-python/blob/main/examples/bedrock/bedrock_client_universal.py)
- [Universal endpoint implementation](https://github.com/highflame-ai/javelin-python/blob/main/examples/bedrock/javelin_bedrock_univ_endpoint.py)
- [OpenAI-compatible interface](https://github.com/highflame-ai/javelin-python/blob/main/examples/bedrock/openai_compatible_univ_bedrock.py)

#### Gemini
- [Basic Gemini integration](https://github.com/highflame-ai/javelin-python/blob/main/examples/gemini/gemini-universal.py)
- [Universal endpoint implementation](https://github.com/highflame-ai/javelin-python/blob/main/examples/gemini/javelin_gemini_univ_endpoint.py)
- [OpenAI-compatible interface](https://github.com/highflame-ai/javelin-python/blob/main/examples/gemini/openai_compatible_univ_gemini.py)

### Agent Examples
- [CrewAI integration](https://github.com/highflame-ai/javelin-python/blob/main/examples/agents/crewai_javelin.ipynb)
- [LangGraph integration](https://github.com/highflame-ai/javelin-python/blob/main/examples/agents/langgraph_javelin.ipynb)

### Basic Examples
- [Asynchronous example](https://github.com/highflame-ai/javelin-python/blob/main/examples/route_examples/aexample.py)
- [Synchronous example](https://github.com/highflame-ai/javelin-python/blob/main/examples/route_examples/example.py)
- [Drop-in replacement example](https://github.com/highflame-ai/javelin-python/blob/main/examples/route_examples/drop_in_replacement.py)

### Advanced Examples
- [Document processing](https://github.com/highflame-ai/javelin-python/blob/main/examples/gemini/document_processing.py)
- [RAG implementation](https://github.com/highflame-ai/javelin-python/blob/main/examples/rag/javelin_rag_embeddings_demo.ipynb)

## Additional Integration Patterns

For more detailed examples and integration patterns, check out:

- [Azure OpenAI Integration](https://docs.getjavelin.io/docs/javelin-core/integration#2-azure-openai-api-endpoints)
- [AWS Bedrock Integration](https://docs.getjavelin.io/docs/javelin-core/integration#3-aws-bedrock-api-endpoints)
- [Supported Language Models](https://docs.getjavelin.io/docs/javelin-core/supported-llms)
35 changes: 35 additions & 0 deletions v2/CLI_PYPROJECT.toml
@@ -0,0 +1,35 @@
# This file shows what the future CLI-only package pyproject.toml would look like
# Once CLI is separated into its own package: highflame-cli
# This serves as a reference for the CLI package separation plan

[tool.poetry]
name = "highflame-cli"
version = "2.0.0"
description = "Command-line interface for Highflame - LLM Gateway Management"
authors = ["Sharath Rajasekar <sharath@highflame.com>"]
readme = "README.md"
license = "Apache-2.0"
homepage = "https://highflame.com"
repository = "https://github.com/highflame-ai/highflame-cli"
packages = [
{ include = "highflame_cli" },
]

[tool.poetry.scripts]
highflame = "highflame_cli.cli:main"

[tool.poetry.dependencies]
python = "^3.9"
highflame = "^2.0.0"
requests = "^2.32.3"

[tool.poetry.group.dev.dependencies]
black = "24.3.0"
flake8 = "^7.3.0"
pre-commit = "^3.3.1"
pytest = "^8.3.5"
pytest-mock = "^3.10.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
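The `[tool.poetry.scripts]` entry above requires a `main` callable in `highflame_cli/cli.py`. As a rough, hypothetical sketch of the module shape that entry point expects (none of these command names or internals are confirmed by the repository):

```python
# Hypothetical sketch of highflame_cli/cli.py -- the module that the
# planned entry point `highflame = "highflame_cli.cli:main"` would import.
import argparse
import sys


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        prog="highflame",
        description="Highflame CLI (illustrative sketch only)",
    )
    sub = parser.add_subparsers(dest="command", required=True)
    sub.add_parser("version", help="Print the CLI version")
    return parser


def main(argv=None) -> int:
    """Entry point: parse arguments and dispatch to the chosen subcommand."""
    args = build_parser().parse_args(argv)
    if args.command == "version":
        print("highflame-cli 2.0.0")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Poetry generates a console script that calls `main()` and uses its return value as the process exit code, which is why `main` returns an `int` rather than calling `sys.exit` itself.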