2025

reading list - 2025

llama.cpp in Docker

build llama.cpp inside a Docker container with AMD ROCm support
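The post presumably contains the full Dockerfile; as a rough illustration only, a minimal sketch of such a build might look like the following. The base image tag, the `GGML_HIP` CMake option, the `HIPCXX`/`HIP_PATH` setup, and the `gfx1030` GPU target are assumptions drawn from llama.cpp's general ROCm build documentation, not from the post itself.

```Dockerfile
# Minimal sketch: build llama.cpp with AMD ROCm (HIP) support.
# Image tag, CMake flags, and GPU target are assumptions -- adjust to your setup.
FROM rocm/dev-ubuntu-22.04:6.1-complete

RUN apt-get update && apt-get install -y --no-install-recommends \
        git cmake build-essential libcurl4-openssl-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /opt
RUN git clone https://github.com/ggml-org/llama.cpp.git

WORKDIR /opt/llama.cpp
# GGML_HIP enables the HIP/ROCm backend; AMDGPU_TARGETS must match your GPU
# architecture (gfx1030 is only an example, e.g. RX 6800/6900 class cards).
RUN HIPCXX="$(hipconfig -l)/clang" HIP_PATH="$(hipconfig -R)" \
    cmake -S . -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1030 -DCMAKE_BUILD_TYPE=Release \
    && cmake --build build --config Release -j"$(nproc)"

# Binaries end up in build/bin after the build.
ENTRYPOINT ["/opt/llama.cpp/build/bin/llama-cli"]
```

Running the resulting container typically requires passing the GPU devices through, e.g. `docker run --device /dev/kfd --device /dev/dri ...`, so that ROCm can see the card from inside the container.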