An LLM-powered Telegram bot.
First, create a Telegram bot to interact with. See the instructions here.
You need an LLM in GGUF format. You can find quite a few through TheBloke on HuggingFace, who has done an enormous service to the community by converting different models to GGUF and quantizing them. Pick one that suits your needs and hardware.
You need llama.cpp in order to run a model on your machine. There are alternatives such as Ollama and LM Studio, but I prefer llama.cpp because it is lightweight. It's easy to install through Homebrew.
$ brew install llama.cpp
You need Python 3 on your machine.
- Micromamba (optional)
For handling the packages needed for different environments. Easy to install with asdf.
$ asdf install
$ micromamba create -n jensen python=3.13
$ micromamba activate jensen
- LLMs, Telegram etc
For using LLM models, the Telegram API etc.
$ pip install -r requirements.txt
Copy ./start-llama-cpp-server-example.sh to ./start-llama-cpp-server.sh and enter the model and settings you want.
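The example script itself is not reproduced here, but a minimal start script could look like the sketch below. The model path, port, and context size are placeholder assumptions; adjust them to your model and hardware.

```shell
#!/usr/bin/env bash
# Minimal sketch of a llama.cpp server start script.
# The model path and settings below are placeholders.
llama-server \
  -m ./models/your-model.gguf \
  --port 8080 \
  -c 4096
```

`llama-server` ships with the Homebrew llama.cpp formula; `-m` selects the GGUF file, `--port` the listen port, and `-c` the context size in tokens.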
Create a .env file with the following properties.
# Telegram
API_KEY=<api key, string> (mandatory)
POLL_INTERVAL=<interval in seconds to use when polling Telegram, float>
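As a rough sketch of how these values might be consumed, here is a minimal loader using only the standard library. The function name, the 2.0-second default, and the returned dict shape are illustrative assumptions, not the app's actual code.

```python
import os

def load_config(env=os.environ):
    """Hypothetical sketch of reading the .env-provided settings."""
    api_key = env.get("API_KEY")
    if not api_key:
        # API_KEY is mandatory per the .env description above
        raise ValueError("API_KEY is mandatory")
    # POLL_INTERVAL is a float of seconds; the 2.0 default is an assumption
    poll_interval = float(env.get("POLL_INTERVAL", "2.0"))
    return {"api_key": api_key, "poll_interval": poll_interval}
```

Note that this assumes the .env values have already been exported into the process environment (e.g. by your shell or a dotenv loader).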
Start the llama.cpp server:
./start-llama-cpp-server.sh
Start the application:
python ./jensen/app.py
And you're off to the races.
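To sanity-check that the llama.cpp server is answering before chatting through Telegram, you can talk to its OpenAI-compatible chat endpoint directly. This is a standalone sketch, not part of the app; the base URL assumes the port from your start script.

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8080"  # assumed llama.cpp server address/port

def build_payload(prompt):
    # OpenAI-style chat completion request body
    return {"messages": [{"role": "user", "content": prompt}]}

def chat(prompt):
    """Send one prompt to the llama.cpp server and return the reply text."""
    req = urllib.request.Request(
        BASE_URL + "/v1/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Run `chat("Hello!")` with the server up to confirm the model responds.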