Created September 12, 2023 16:19
Clones and compiles llama.cpp with the ROCm Windows SDK for AMD GPU acceleration on Windows! yay!
@echo off
:: Path to the bin directory of the ROCm Windows SDK (change the path if necessary),
:: e.g. set ROCM_BIN=C:\Program Files\AMD\ROCm\5.5\bin
set ROCM_BIN=%HIP_PATH%\bin
set LLAMA_DIR=%CD%\llama.cpp
set LLAMA_BIN_OUT=%CD%\llama.cpp.bin
:: By default the bin directory of the ROCm Windows SDK does not seem to be in PATH, so we add it
set PATH=%ROCM_BIN%;%PATH%
:: Clone llama.cpp
git clone https://github.com/ggerganov/llama.cpp.git "%LLAMA_DIR%"
:: Create build files in release mode (the AMDGPU_TARGETS list is quoted so cmd
:: does not treat the semicolons as argument separators)
cmake -B "%LLAMA_DIR%\build" -S "%LLAMA_DIR%" ^
    -DCMAKE_BUILD_TYPE=Release ^
    -DCMAKE_C_COMPILER=clang ^
    -DCMAKE_CXX_COMPILER=clang++ ^
    -DLLAMA_NATIVE=ON ^
    -DLLAMA_HIPBLAS=ON ^
    "-DAMDGPU_TARGETS=gfx906;gfx1030;gfx1100;gfx1101;gfx1102" ^
    -G Ninja
:: Build llama.cpp
cmake --build "%LLAMA_DIR%\build"
:: Copy the binaries we built
xcopy "%LLAMA_DIR%\build\bin" "%LLAMA_BIN_OUT%" /s /I
:: Copy the ROCm files needed to run the compiled binaries (not needed if the ROCm bin dir is in PATH)
xcopy "%ROCM_BIN%\rocblas" "%LLAMA_BIN_OUT%\rocblas" /s /I
xcopy "%ROCM_BIN%\rocblas.dll" "%LLAMA_BIN_OUT%" /y
xcopy "%ROCM_BIN%\hipblas.dll" "%LLAMA_BIN_OUT%" /y
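
Once the script finishes, the GPU-accelerated binaries live in the llama.cpp.bin directory and can be run from there. A minimal sketch of an invocation (the model path and layer count here are assumptions; -m points at a GGUF model file and -ngl sets how many layers to offload to the GPU):

    cd llama.cpp.bin
    :: hypothetical model path -- substitute your own GGUF file
    main.exe -m C:\models\llama-2-7b.Q4_K_M.gguf -ngl 32 -p "Hello"

If the ROCm DLLs were not copied alongside the binaries, the same commands work as long as the ROCm bin directory is in PATH.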