A local social bot powered by your chosen LLM (Large Language Model). This project enables you to run a conversational bot on your machine, ensuring privacy and control.
(The Ollama service must be running on your local machine. You can pick another Ollama-supported model in settings.py — llama, llama3.1, or llama3.2, whichever you prefer. Run `ollama pull <model>` first.)
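Picking a model could look like the sketch below. The variable names here are assumptions for illustration; check the repository's actual settings.py for the real keys.

```python
# settings.py (a minimal sketch; names are illustrative, not the
# repository's actual settings — adjust to match the real file)
MODEL_NAME = "gemma3:270m"              # any model pulled with `ollama pull`
OLLAMA_HOST = "http://localhost:11434"  # default local Ollama endpoint
TEMPERATURE = 0.7                       # sampling temperature for replies
```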
- Local inference with an LLM
- Social interaction capabilities (if you add any)
- Easy setup and launch
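Local inference talks to the Ollama server over its HTTP API. A minimal sketch using only the standard library is below; the model name and host are assumptions, and the repository's own code may structure this differently.

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "gemma3:270m") -> bytes:
    """Encode a non-streaming generate request for Ollama's /api/generate."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(prompt: str, model: str = "gemma3:270m",
             host: str = "http://localhost:11434") -> str:
    """Send the prompt to a locally running Ollama server and return the reply."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With Ollama running and the model pulled, `generate("Hello!")` returns the model's text reply.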
- Python 3.8+
- Astral's uv package manager
Clone the repository and pull a model:

git clone https://github.com/Sparky4567/social_bot.git
cd social_bot
ollama pull gemma3:270m

Install Astral's uv package manager:

curl -LsSf https://astral.sh/uv/install.sh | sh
On Debian/Ubuntu, install the system build dependencies first:

sudo apt-get update && sudo apt-get install build-essential portaudio19-dev python3-dev

Then start the bot with Astral's uv:

uv run main.py
To launch the Streamlit GUI instead:

uv run streamlit run gui.py
Edit settings.py to set your model path and parameters.
MIT