Control your computer remotely through Telegram using AI-powered natural language commands
Botty is an intelligent assistant that combines Ollama's LLM capabilities with a powerful tool system, allowing you to control your computer through a beautiful web interface or Telegram bot using natural language.
- 🧠 AI-Powered: Uses Ollama with llama3.2 for intelligent command interpretation
- 🛠️ Native Tool System: Uses Ollama's native function calling (not prompt hacking)
- 💬 Dual Interface: Web app + Telegram bot for remote control
- 🔒 Secure by Design: Whitelist-based authentication and command sandboxing
- ⚡ Real-time Streaming: See AI responses as they're generated
- 🎯 Smart Tools: Execute commands, manage files, create reminders, and more
| Tool | Description | Example |
|---|---|---|
| `run_command` | Execute safe system commands | "run git status" |
| `read_file` | Read file contents | "read the package.json" |
| `list_directory` | List directory contents | "list the files" |
| `create_post` | Create text reminders | "remind me to buy milk" |
| `google_search` | Open Chrome with a search | "search for pasta recipes" |
- Node.js 18+
- pnpm 8+
- Ollama installed and running
```bash
# Clone the repository
git clone https://github.com/yourusername/botty.git
cd botty

# Install dependencies
pnpm install

# Pull Ollama model
ollama pull llama3.2:3b

# Configure environment
cp backend/.env.example backend/.env
cp frontend/.env.example frontend/.env

# Edit backend/.env with your settings
nano backend/.env
```

`backend/.env`:

```env
PORT=3000
FRONTEND_URL=http://localhost:5173
OLLAMA_MODEL=llama3.2:3b
OLLAMA_URL=http://localhost:11434

# Telegram Bot (optional)
TELEGRAM_BOT_TOKEN=your_bot_token_from_botfather
TELEGRAM_ALLOWED_USERS=your_telegram_user_id
```

`frontend/.env`:

```env
VITE_WS_URL=http://localhost:3000
```

```bash
# Start both backend and frontend
pnpm dev

# Or start them separately
pnpm --filter botty-backend dev
pnpm --filter botty-frontend dev
```

Access the web app at: http://localhost:5173
1. Create bot: Talk to @BotFather on Telegram
   - Send `/newbot`
   - Choose name and username
   - Copy the token
2. Get your User ID: Talk to @userinfobot
   - Send `/start`
   - Copy your user ID
3. Configure: Add to `backend/.env`:

   ```env
   TELEGRAM_BOT_TOKEN=123456789:ABC-DEF...
   TELEGRAM_ALLOWED_USERS=123456789
   ```

4. Use: Find your bot on Telegram and start chatting!

Available bot commands:

- `/start` - Welcome message
- `/help` - Show help
- `/tools` - List available tools
- `/status` - System status
- `/clear` - Clear conversation history
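Slash commands like these are typically routed before the message ever reaches the LLM. A minimal illustrative router (the reply strings and structure here are a sketch, not Botty's actual handler, which would call into the bot service):

```javascript
// Maps known slash commands to reply text; anything else falls through
// to the LLM (represented here by returning null).
const commands = {
  '/start': () => 'Welcome to Botty!',
  '/help': () => 'Send a message in natural language, or use /tools, /status, /clear.',
  '/tools': () => 'run_command, read_file, list_directory, create_post, google_search',
};

function handleMessage(text) {
  const cmd = text.trim().split(/\s+/)[0];
  const handler = commands[cmd];
  return handler ? handler() : null; // null → forward to the LLM
}
```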
```
┌─────────────┐      ┌──────────────┐      ┌─────────────┐
│  Telegram   │─────▶│   Backend    │─────▶│   Ollama    │
│     Bot     │      │  Express.js  │      │  llama3.2   │
└─────────────┘      │  Socket.io   │      └─────────────┘
                     │              │
┌─────────────┐      │    Tools     │
│   Vue.js    │─────▶│   Registry   │
│  Frontend   │      └──────────────┘
└─────────────┘
```
- Vue 3 - Progressive JavaScript framework
- Pinia - State management
- Socket.io Client - Real-time communication
- Vite - Build tool
- Express.js - Web server
- Socket.io - WebSocket server
- Ollama - LLM integration
- node-telegram-bot-api - Telegram bot
- Ollama - Local LLM runtime
- llama3.2:3b - Language model
- Native Tools - Function calling API
```
botty/
├── backend/
│   ├── src/
│   │   ├── ai/           # Ollama client
│   │   ├── tools/        # Tool definitions & registry
│   │   ├── telegram/     # Telegram bot service
│   │   ├── websocket/    # WebSocket handlers
│   │   ├── middleware/   # Express middleware
│   │   └── server.js     # Main server
│   └── posts/            # Generated files
├── frontend/
│   ├── src/
│   │   ├── components/   # Vue components
│   │   ├── composables/  # Vue composables
│   │   ├── utils/        # Utilities
│   │   └── App.vue       # Root component
│   └── index.html
└── pnpm-workspace.yaml
```
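The `backend/` and `frontend/` packages above are tied together as a pnpm workspace by `pnpm-workspace.yaml`; a minimal version for this layout might look like:

```yaml
packages:
  - 'backend'
  - 'frontend'
```

The `pnpm --filter botty-backend dev` command from the Quick Start works against the `name` field in each package's `package.json` (assumed here to be `botty-backend` and `botty-frontend`, matching those commands).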
- Command Whitelist: Only safe, read-only commands allowed
- User Authentication: Telegram user ID whitelist
- No Destructive Actions: Commands like `rm` and `sudo` are blocked
- Input Sanitization: File paths and parameters are validated
- Timeout Protection: Commands time out after 30 seconds

❌ Blocked: `rm`, `dd`, `chmod`, `sudo`, `shutdown`, `kill`, `apt install`, `wget`, `curl`, `ssh`

✅ Allowed: `ls`, `pwd`, `cat`, `git status`, `date`, `ps`, `grep`, `find`, `echo`, etc.
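A whitelist check along these lines can be sketched as follows (illustrative only; the project's actual validation may differ, e.g. in how it handles arguments and multi-word commands):

```javascript
// Only the first word (the binary) is compared against the whitelist,
// plus a few explicitly allowed multi-word commands like "git status".
const ALLOWED = new Set(['ls', 'pwd', 'cat', 'date', 'ps', 'grep', 'find', 'echo']);
const ALLOWED_PREFIXES = ['git status'];

function isCommandAllowed(command) {
  const trimmed = command.trim();
  const binary = trimmed.split(/\s+/)[0];
  return (
    ALLOWED.has(binary) ||
    ALLOWED_PREFIXES.some((p) => trimmed === p || trimmed.startsWith(p + ' '))
  );
}
```

Note that a deny-list alone is not enough: defaulting to "blocked unless whitelisted" is what keeps commands like `rm` out even if they are never explicitly listed.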
Create a new tool in `backend/src/tools/`:

```javascript
export const myTool = {
  name: 'my_tool',
  description: 'What this tool does',
  parameters: {
    type: 'object',
    properties: {
      param1: {
        type: 'string',
        description: 'Parameter description'
      }
    },
    required: ['param1']
  },
  async execute(params) {
    // Your tool logic here
    return {
      success: true,
      message: 'Tool executed',
      data: params.param1
    };
  }
};
```

Register it in `backend/src/tools/tool-registry.js`:

```javascript
import { myTool } from './my-tool.tool.js';

this.registerTool(myTool);
```

- Voice message support in Telegram
- File upload/download via Telegram
- Docker deployment
- Multi-user support with sessions
- Advanced tool: Browser automation (Puppeteer)
- Advanced tool: Database queries
- Advanced tool: API calls
- Web authentication for frontend
- Conversation export
- Tool usage analytics
```bash
# Check if Ollama is running
curl http://localhost:11434/api/version

# Start Ollama
ollama serve
```

Telegram bot not responding:

- Verify token in `.env`
- Check user ID is in `TELEGRAM_ALLOWED_USERS`
- Check backend logs for errors

Frontend can't connect:

- Verify backend is running on port 3000
- Check CORS settings in `backend/.env`
- Verify WebSocket connection in browser console
MIT © [Your Name]
- Ollama - For the amazing local LLM runtime
- Anthropic - For inspiration from Claude's tool use
- Telegram - For the bot API
⭐ If you found this helpful, please star the repo!

Made with ❤️ and AI