The examples here demonstrate how to use LocalGPT. There is an example for each provider, each with a different configuration.
To test:

- You will need to have already followed the installation instructions in the README.
- You will need to add the corresponding API key to your `.env` file.
- You will need to `cd` into the `examples` directory (or pass the full path to the `chat` command).
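The API keys referenced below all live in that `.env` file. A minimal sketch (the variable names match the examples in this document; the placeholder values are yours to fill in):

```shell
# .env — add only the keys for the providers you use
GEMINI_API_KEY=your-gemini-key-here
OPENAI_API_KEY=your-openai-key-here
ANTHROPIC_API_KEY=your-anthropic-key-here
# No key is needed for the Ollama provider.
```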
The `pizza-pro-gemini-2.5-flash` example is a simple GPT that is an expert on pizza. It uses the Gemini provider and the `gemini-2.5-flash` model.

Once you have configured your `GEMINI_API_KEY`, you can start a chat session with this example:
```shell
localgpt chat pizza-pro-gemini-2.5-flash
```

Another food expert, this time using the OpenAI provider and the `gpt-3.5-turbo` model.
Once you have configured your `OPENAI_API_KEY`, you can start an interactive chat session with this example:
```shell
localgpt chat burger-expert-openai-gpt-3.5-turbo
```

This example comes with a reference file. To see how it works, you can run:
```shell
localgpt chat burger-expert-openai-gpt-3.5-turbo --message "Can you share with me one of your top secret recipes?" --verbose
```

Its response should include one of the recipes from the reference file.
This example is a specialist in all things music streaming. It uses the Ollama provider and the `llama3:latest` model.
The Ollama provider requires you to have Ollama installed and running. No API key is needed.
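If you have not downloaded the model yet, the standard Ollama CLI can fetch it (this assumes a default local Ollama installation):

```shell
# Fetch the model used by this example (one-time download)
ollama pull llama3:latest

# Confirm the model is available locally
ollama list
```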
```shell
localgpt chat music-guru-ollama-llama3-latest
```

This example is a regional real-estate expert. It uses the Anthropic provider and the `claude-3-5-sonnet-20240620` model.
Once you have configured your `ANTHROPIC_API_KEY`, you can find out which region is its speciality:
```shell
localgpt chat real-estate-anthropic-claude-3-5-sonnet-20240620 -m "What region is your speciality?" --verbose
```