
πŸ€– Botty - AI Assistant with Remote Control

Control your computer remotely through Telegram using AI-powered natural language commands

Botty is an intelligent assistant that combines Ollama's LLM capabilities with a powerful tool system, allowing you to control your computer through a beautiful web interface or Telegram bot using natural language.


✨ Features

  • 🧠 AI-Powered: Uses Ollama with llama3.2 for intelligent command interpretation
  • πŸ› οΈ Native Tool System: Uses Ollama's native function calling (not prompt hacking)
  • πŸ’¬ Dual Interface: Web app + Telegram bot for remote control
  • πŸ”’ Secure by Design: Whitelist-based authentication and command sandboxing
  • ⚑ Real-time Streaming: See AI responses as they're generated
  • 🎯 Smart Tools: Execute commands, manage files, create reminders, and more

🎯 Available Tools

Tool             Description                     Example
run_command      Execute safe system commands    "run git status"
read_file        Read file contents              "read the package.json"
list_directory   List directory contents         "list the files"
create_post      Create text reminders           "remind me to buy milk"
google_search    Open Chrome with a search       "search for pasta recipes"

πŸš€ Quick Start

Prerequisites

  • Node.js 18+
  • pnpm 8+
  • Ollama installed and running

Installation

# Clone the repository
git clone https://github.com/yourusername/botty.git
cd botty

# Install dependencies
pnpm install

# Pull Ollama model
ollama pull llama3.2:3b

# Configure environment
cp backend/.env.example backend/.env
cp frontend/.env.example frontend/.env

# Edit backend/.env with your settings
nano backend/.env

Configuration

Backend (.env)

PORT=3000
FRONTEND_URL=http://localhost:5173
OLLAMA_MODEL=llama3.2:3b
OLLAMA_URL=http://localhost:11434

# Telegram Bot (optional)
TELEGRAM_BOT_TOKEN=your_bot_token_from_botfather
TELEGRAM_ALLOWED_USERS=your_telegram_user_id

Frontend (.env)

VITE_WS_URL=http://localhost:3000

Running

# Start both backend and frontend
pnpm dev

# Or start them separately
pnpm --filter botty-backend dev
pnpm --filter botty-frontend dev

Access the web app at: http://localhost:5173

πŸ“± Telegram Bot Setup

  1. Create bot: Talk to @BotFather on Telegram

    • Send /newbot
    • Choose name and username
    • Copy the token
  2. Get your User ID: Talk to @userinfobot

    • Send /start
    • Copy your user ID
  3. Configure: Add to backend/.env

    TELEGRAM_BOT_TOKEN=123456789:ABC-DEF...
    TELEGRAM_ALLOWED_USERS=123456789
  4. Use: Find your bot on Telegram and start chatting!
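The whitelist step above boils down to comparing the sender's numeric ID against the IDs in TELEGRAM_ALLOWED_USERS. A minimal sketch of that check (parseAllowedUsers and isAuthorized are illustrative names, not functions from this repo):

```javascript
// Hypothetical helpers mirroring the whitelist check described above.
function parseAllowedUsers(envValue) {
  // TELEGRAM_ALLOWED_USERS is a comma-separated list of numeric Telegram IDs
  return new Set(
    (envValue || '')
      .split(',')
      .map((id) => id.trim())
      .filter(Boolean)
  );
}

function isAuthorized(userId, allowed) {
  // Telegram delivers msg.from.id as a number; normalize before comparing
  return allowed.has(String(userId));
}

const allowed = parseAllowedUsers('123456789, 987654321');
console.log(isAuthorized(123456789, allowed)); // true
console.log(isAuthorized(555, allowed));       // false
```

In the bot's message handler, any message failing this check would simply be ignored (or answered with an "unauthorized" notice) before reaching the AI.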

Telegram Commands

  • /start - Welcome message
  • /help - Show help
  • /tools - List available tools
  • /status - System status
  • /clear - Clear conversation history

πŸ—οΈ Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”         β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”         β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Telegram  │────────▢│    Backend   │────────▢│   Ollama    β”‚
β”‚     Bot     β”‚         β”‚  Express.js  β”‚         β”‚  llama3.2   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜         β”‚   Socket.io  β”‚         β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                        β”‚              β”‚                β”‚
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”         β”‚    Tools     β”‚β—€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
β”‚   Vue.js    │────────▢│   Registry   β”‚
β”‚   Frontend  β”‚         β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

πŸ› οΈ Tech Stack

Frontend

  • Vue 3 - Progressive JavaScript framework
  • Pinia - State management
  • Socket.io Client - Real-time communication
  • Vite - Build tool

Backend

  • Express.js - Web server
  • Socket.io - WebSocket server
  • Ollama - LLM integration
  • node-telegram-bot-api - Telegram bot

AI

  • Ollama - Local LLM runtime
  • llama3.2:3b - Language model
  • Native Tools - Function calling API
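"Native tools" means the tool schemas are sent in the tools field of Ollama's /api/chat request rather than being pasted into the prompt. A sketch of what such a request body looks like (buildChatRequest and the exact tool schema here are illustrative, not code from this repo):

```javascript
// Hypothetical helper that builds an Ollama /api/chat request body
// using its native function-calling format.
function buildChatRequest(model, userText, tools) {
  return {
    model,
    messages: [{ role: 'user', content: userText }],
    tools,          // forwarded to the model for native tool selection
    stream: false,
  };
}

// Example tool definition in the function-calling schema
const runCommandTool = {
  type: 'function',
  function: {
    name: 'run_command',
    description: 'Execute a whitelisted shell command',
    parameters: {
      type: 'object',
      properties: {
        command: { type: 'string', description: 'Command to run' },
      },
      required: ['command'],
    },
  },
};

const body = buildChatRequest('llama3.2:3b', 'run git status', [runCommandTool]);
// POST this as JSON to http://localhost:11434/api/chat; when the model
// decides to call a tool, the response's message.tool_calls lists the calls.
```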

πŸ“ Project Structure

botty/
β”œβ”€β”€ backend/
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ ai/              # Ollama client
β”‚   β”‚   β”œβ”€β”€ tools/           # Tool definitions & registry
β”‚   β”‚   β”œβ”€β”€ telegram/        # Telegram bot service
β”‚   β”‚   β”œβ”€β”€ websocket/       # WebSocket handlers
β”‚   β”‚   β”œβ”€β”€ middleware/      # Express middleware
β”‚   β”‚   └── server.js        # Main server
β”‚   └── posts/               # Generated files
β”œβ”€β”€ frontend/
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ components/      # Vue components
β”‚   β”‚   β”œβ”€β”€ composables/     # Vue composables
β”‚   β”‚   β”œβ”€β”€ utils/           # Utilities
β”‚   β”‚   └── App.vue          # Root component
β”‚   └── index.html
└── pnpm-workspace.yaml

πŸ”’ Security

  • Command Whitelist: Only safe, read-only commands allowed
  • User Authentication: Telegram user ID whitelist
  • No Destructive Actions: Commands like rm, sudo are blocked
  • Input Sanitization: File paths and parameters are validated
  • Timeout Protection: Commands timeout after 30 seconds

Blocked Commands

❌ rm, dd, chmod, sudo, shutdown, kill, apt install, wget, curl, ssh

Allowed Commands

βœ… ls, pwd, cat, git status, date, ps, grep, find, echo, etc.

🎨 Adding Custom Tools

Create a new tool in backend/src/tools/:

export const myTool = {
  name: 'my_tool',
  description: 'What this tool does',
  parameters: {
    type: 'object',
    properties: {
      param1: {
        type: 'string',
        description: 'Parameter description'
      }
    },
    required: ['param1']
  },

  async execute(params) {
    // Your tool logic here
    return {
      success: true,
      message: 'Tool executed',
      data: params.param1
    };
  }
};

Register in backend/src/tools/tool-registry.js:

import { myTool } from './my-tool.tool.js';
this.registerTool(myTool);
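For context, a minimal registry matching the registerTool() call above might look like this. This is a hypothetical sketch; the repo's actual ToolRegistry likely adds validation, logging, and error handling:

```javascript
// Illustrative minimal tool registry.
class ToolRegistry {
  constructor() {
    this.tools = new Map();
  }

  registerTool(tool) {
    this.tools.set(tool.name, tool);
  }

  // Definitions in the shape Ollama's `tools` parameter expects
  getDefinitions() {
    return [...this.tools.values()].map((t) => ({
      type: 'function',
      function: {
        name: t.name,
        description: t.description,
        parameters: t.parameters,
      },
    }));
  }

  async execute(name, params) {
    const tool = this.tools.get(name);
    if (!tool) throw new Error(`Unknown tool: ${name}`);
    return tool.execute(params);
  }
}
```

When the model returns a tool_call, the backend looks the tool up by name in this registry, runs execute(), and feeds the result back into the conversation.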

πŸ—ΊοΈ Roadmap

  • Voice message support in Telegram
  • File upload/download via Telegram
  • Docker deployment
  • Multi-user support with sessions
  • Advanced tool: Browser automation (Puppeteer)
  • Advanced tool: Database queries
  • Advanced tool: API calls
  • Web authentication for frontend
  • Conversation export
  • Tool usage analytics

πŸ› Troubleshooting

Ollama not connecting

# Check if Ollama is running
curl http://localhost:11434/api/version

# Start Ollama
ollama serve

Telegram bot not responding

  • Verify token in .env
  • Check user ID is in TELEGRAM_ALLOWED_USERS
  • Check backend logs for errors

Frontend can't connect

  • Verify backend is running on port 3000
  • Check CORS settings in backend/.env
  • Verify WebSocket connection in browser console

πŸ“„ License

MIT Β© [Your Name]

πŸ™ Acknowledgments

  • Ollama - For the amazing local LLM runtime
  • Anthropic - For inspiration from Claude's tool use
  • Telegram - For the bot API

⭐ If you found this helpful, please star the repo!

Made with ❀️ and AI
