# flash-attention

## Available modules
The overview below shows which flash-attention installations are available per HPC-UGent Tier-2 cluster, ordered by software version (newest to oldest).
To start using flash-attention, load one of these modules using a `module load` command like:

```
module load flash-attention/2.6.3-foss-2023a-CUDA-12.1.1
```
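Once the module is loaded, the `flash_attn` Python package it provides should be importable. A minimal sketch for checking this from Python (assuming the package's import name is `flash_attn`, which is the convention for the upstream flash-attention project):

```python
import importlib.util

def flash_attention_available() -> bool:
    # True if the flash_attn package (provided by a loaded
    # flash-attention module) can be imported in this environment.
    return importlib.util.find_spec("flash_attn") is not None

if __name__ == "__main__":
    print("flash_attn importable:", flash_attention_available())
```

This is useful in job scripts that should fail early with a clear message when the module was not loaded, rather than crashing mid-run on an `ImportError`.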
(This data was automatically generated on Wed, 12 Mar 2025 at 15:45:25 CET)
| | accelgor | doduo | donphan | gallade | joltik | litleo | shinx |
|---|---|---|---|---|---|---|---|
| flash-attention/2.6.3-foss-2023a-CUDA-12.1.1 | x | - | x | - | x | x | - |