
# llama-cpp-python

## Available modules

The overview below shows which llama-cpp-python installations are available per HPC-UGent Tier-2 cluster, ordered based on software version (new to old).

To start using llama-cpp-python, load one of these modules using a `module load` command like:

```shell
module load llama-cpp-python/0.3.2-gfbf-2023a-CUDA-12.1.1
```

(This data was automatically generated on Tue, 24 Dec 2024 at 15:44:51 CET)

| | accelgor | doduo | donphan | gallade | joltik | shinx | skitty |
|---|---|---|---|---|---|---|---|
| llama-cpp-python/0.3.2-gfbf-2023a-CUDA-12.1.1 | x | - | x | - | x | - | - |
| llama-cpp-python/0.3.2-gfbf-2023a | x | - | x | - | x | - | - |
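As a quick sanity check after loading one of the modules above, you can import the library and run a short completion. This is a minimal sketch, not part of the generated module overview: the model path is a placeholder for a GGUF model file you provide yourself, and `n_gpu_layers=-1` only makes sense with the CUDA-enabled module on a GPU cluster (e.g. accelgor, donphan, joltik).

```python
# Minimal usage sketch for llama-cpp-python, run after
# `module load llama-cpp-python/0.3.2-gfbf-2023a-CUDA-12.1.1`.
from llama_cpp import Llama

llm = Llama(
    model_path="./model.gguf",  # placeholder: path to a GGUF model you downloaded
    n_gpu_layers=-1,            # offload all layers to GPU; use 0 with the non-CUDA module
    verbose=False,
)

# Generate a short completion and print the text.
output = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(output["choices"][0]["text"])
```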