Welcome to the llama.cpp wiki!

## Install

### Arch Linux

```sh
yay -S llama-cpp
yay -S llama-cpp-cuda
yay -S llama-cpp-opencl
```

### Nix

```sh
nix run github:ggerganov/llama.cpp
```

### Termux

Wait for https://github.com/termux/termux-packages/pull/17457.

```sh
apt install llama-cpp
```

### MSYS2

Wait for https://github.com/msys2/MINGW-packages/issues/17808.

```sh
pacman -S llama-cpp
```
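
Whichever packaged install you use, a quick smoke test is to run a short prompt against a local model. This is only a sketch under assumptions: the binary name (`main` in 2023-era builds, `llama-cli` in newer ones, and possibly renamed by your distribution package) and the model path are placeholders.

```sh
# Hypothetical smoke test -- the binary name and the model path are assumptions;
# substitute whatever your package installed and wherever your model file lives.
main -m ./models/7B/ggml-model-q4_0.bin -p "Building a website can be done in 10 simple steps:" -n 64
```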

### DEB

```sh
git clone --depth=1 https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -Bbuild -D...
cmake --build build
cd build
cpack -G DEB
dpkg -i *.deb
```

### RPM

```sh
git clone --depth=1 https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -Bbuild -D...
cmake --build build
cd build
cpack -G RPM
rpm -i *.rpm
```
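
In both recipes above, `-D...` is a placeholder for whatever CMake cache options you want; the page leaves them open on purpose. As a sketch only (backend option names change between releases; `LLAMA_CUBLAS` was the CUDA switch around the time this page was written, so check the README of your checkout for the current names):

```sh
# Example configure/build only -- the -D options shown are assumptions that
# depend on your hardware and llama.cpp revision; consult the README for
# the switches that exist in your checkout.
cmake -Bbuild -DCMAKE_BUILD_TYPE=Release -DLLAMA_CUBLAS=ON
cmake --build build -j
```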

Useful information for users that doesn't fit into the README:

- Home
- Feature Matrix
- GGML Tips & Tricks
- Chat Templating
- Metadata Override
- HuggingFace Model Card Metadata Interoperability Consideration
 
This is information useful for maintainers and developers that does not fit into code comments.
Click on a badge to jump to its workflow. This is here as a useful general view of all the actions, so that we can notice more quickly if main-branch automation is broken, and where.