OctopusSolutionsEngineering/OllamaInference

A Docker image that serves the gemma3:1b model through Ollama's OpenAI-compatible API, based on the instructions at https://cloud.google.com/run/docs/tutorials/gpu-gemma-with-ollama.
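
The linked tutorial builds its image by layering a model pull on top of the official ollama/ollama base image. The Dockerfile in this repository may differ, but a minimal sketch following that tutorial (with the model name and port taken from the examples below) looks roughly like this:

FROM ollama/ollama:latest

# Listen on all interfaces on port 8080 so the API is reachable from outside the container
ENV OLLAMA_HOST=0.0.0.0:8080

# Store model weights inside the image instead of the base image's default volume
ENV OLLAMA_MODELS=/models

# Pull the model at build time so the container starts without downloading weights
RUN ollama serve & sleep 5 && ollama pull gemma3:1b

ENTRYPOINT ["ollama", "serve"]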

Running the image

docker run -p 8080:8080 ghcr.io/octopussolutionsengineering/ollamainference:latest
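
The image comes from a GPU-focused tutorial, so if the host has an NVIDIA GPU and the NVIDIA Container Toolkit installed, the model can presumably run on the GPU by passing it through with Docker's standard --gpus flag:

docker run --gpus all -p 8080:8080 ghcr.io/octopussolutionsengineering/ollamainference:latest

Without a GPU, Ollama falls back to CPU inference, which should still work for a model as small as gemma3:1b, just more slowly.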

Testing the container

curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma3:1b",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
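
Because the container exposes Ollama's OpenAI-compatible chat completions endpoint, standard parameters such as stream should also be accepted. A streamed variant of the same request (assuming the gemma3:1b model name used above) looks like this:

curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma3:1b",
    "stream": true,
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'

With streaming enabled, the completion arrives as a series of partial chunks rather than a single JSON document.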
