
Social Local LLM Bot

A local social bot powered by your chosen LLM (Large Language Model). This project enables you to run a conversational bot on your machine, ensuring privacy and control.

The Ollama service must be running on your local machine. You can pick any Ollama-supported model in settings.py (for example llama3.1, llama3.2, or gemma3:270m - whatever you prefer); run ollama pull <model> first.
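Local inference works by talking to the Ollama service's HTTP API on its default port. The sketch below is not the project's own code, just a minimal illustration of how a prompt reaches a locally pulled model via Ollama's /api/generate endpoint (the model name is the one suggested in the installation steps):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot (non-chat) generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="gemma3:270m"):
    """Assemble the JSON body for a non-streaming generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="gemma3:270m"):
    """Send a prompt to the locally running Ollama service and return its reply."""
    body = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled beforehand.
    print(ask_ollama("Introduce yourself in one sentence."))
```

Because everything stays on localhost, no prompt or reply ever leaves your machine.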

Features

  • Local inference with an LLM
  • Social interaction capabilities (if you add any)
  • Easy setup and launch

Requirements

  • Ollama, installed and running locally
  • Astral's uv package manager (installation covered below)
  • Build dependencies (Debian/Ubuntu): build-essential, portaudio19-dev, python3-dev

Installation

  1. Clone the repository:

    git clone https://github.com/Sparky4567/social_bot.git
    cd social_bot

  2. Pull the model you want to use:

    ollama pull gemma3:270m

  3. Install Astral's uv package manager:

    curl -LsSf https://astral.sh/uv/install.sh | sh

Then follow the launch instructions below.

Launch Instructions

To start the bot, use Astral's uv. Before the first launch, install the required system packages (Debian/Ubuntu):

sudo apt-get update && sudo apt-get install build-essential portaudio19-dev python3-dev

No GUI mode

uv run main.py

GUI mode (Streamlit) - background tasks are still logged to the terminal

uv run streamlit run gui.py

Configuration

Edit settings.py to set your model path and parameters.
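The repository does not document the exact contents of settings.py, so the following is only an illustrative sketch of what such a configuration module might contain; every variable name here is an assumption, not the project's actual API:

```python
# settings.py - illustrative sketch only; the real file's variable
# names may differ. Adjust to match the project's actual settings.

MODEL = "gemma3:270m"                    # any model pulled with `ollama pull`
OLLAMA_HOST = "http://localhost:11434"   # Ollama's default local endpoint
TEMPERATURE = 0.7                        # sampling temperature for replies
MAX_TOKENS = 256                         # cap on generated tokens per reply
```

Whatever the real variable names are, the model you set must already be pulled locally, or generation requests will fail.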

License

MIT

About

social_bot - a bare-bones local backend for creating V-Tubers and LLM assistants
