# Precompiled Wheels
Prebuilt Python wheels for CUDA and PyTorch combinations that are awkward to build locally.
Filename convention:

```
<package>-<version>+cu<cuda>.torch<torch_ver>-<py_tag>-<abi_tag>-<platform>.whl
```

Example:

```
sageattention-2.2.0+cu130.torch2.11-cp313-cp313-win_amd64.whl
```
This means:

- CUDA 13.0
- PyTorch 2.11
- Python 3.13
- Windows x86_64
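To make the convention concrete, here is a small Python sketch that splits a filename into its parts. The regex and the `parse_wheel_name` helper are illustrative only, not part of this repository:

```python
import re

# Loose pattern for the naming convention above; illustrative, not exhaustive.
WHEEL_RE = re.compile(
    r"(?P<package>\w+)-(?P<version>[\w.]+?)"
    r"(?:\+cu(?P<cuda>\d+)\.torch(?P<torch_ver>[\d.]+))?"
    r"-(?P<py_tag>\w+)-(?P<abi_tag>\w+)-(?P<platform>\w+)\.whl"
)

def parse_wheel_name(filename: str) -> dict:
    """Return the convention's fields, or raise if the name doesn't match."""
    m = WHEEL_RE.fullmatch(filename)
    if m is None:
        raise ValueError(f"unrecognized wheel name: {filename}")
    return m.groupdict()

print(parse_wheel_name(
    "sageattention-2.2.0+cu130.torch2.11-cp313-cp313-win_amd64.whl"
))
# {'package': 'sageattention', 'version': '2.2.0', 'cuda': '130',
#  'torch_ver': '2.11', 'py_tag': 'cp313', 'abi_tag': 'cp313',
#  'platform': 'win_amd64'}
```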
## Files

### Windows x86_64
| File | Notes |
|---|---|
| deep_gemm-2.3.0+cu128.torch2.10-cp310-cp310-win_amd64.whl | DeepGEMM, Python 3.10, CUDA 12.8, Torch 2.10 |
| flash_attn-2.8.3-cp311-cp311-win_amd64.whl | FlashAttention 2, Python 3.11 |
| sageattention-2.2.0+cu128.torch2.10-cp310-cp310-win_amd64.whl | SageAttention 2.2.0, Python 3.10, CUDA 12.8, Torch 2.10 |
| sageattention-2.2.0+cu130.torch2.11-cp312-cp312-win_amd64.whl | SageAttention 2.2.0, Python 3.12, CUDA 13.0, Torch 2.11 |
| sageattention-2.2.0+cu130.torch2.11-cp313-cp313-win_amd64.whl | SageAttention 2.2.0, Python 3.13, CUDA 13.0, Torch 2.11 |
| xformers-0.0.35+cu130.torch2.11-py39-none-win_amd64.whl | Existing stable-ABI xformers wheel |
| xformers-0.0.35+cu130.torch2.11-cp312-cp312-win_amd64.whl | Repacked xformers wheel for Python 3.12 |
| xformers-0.0.35+cu130.torch2.11-cp313-cp313-win_amd64.whl | Repacked xformers wheel for Python 3.13 |
### Linux aarch64

| File | Notes |
|---|---|
| flash_attn-2.8.3-cp310-cp310-linux_aarch64.whl | FlashAttention 2, Python 3.10 |
| flash_attn-2.8.4+cu130.torch2.11-cp312-cp312-linux_aarch64.whl | FlashAttention 2, Python 3.12, CUDA 13.0, Torch 2.11 |
| flash_attn-2.8.4+cu130.torch2.11-cp313-cp313-linux_aarch64.whl | FlashAttention 2, Python 3.13, CUDA 13.0, Torch 2.11 |
| sageattention-2.2.0+cu130.torch2.11-cp312-cp312-linux_aarch64.whl | SageAttention 2.2.0, Python 3.12, CUDA 13.0, Torch 2.11 |
| sageattention-2.2.0+cu130.torch2.11-cp313-cp313-linux_aarch64.whl | SageAttention 2.2.0, Python 3.13, CUDA 13.0, Torch 2.11 |
| xformers-0.0.35+cu130.torch2.11-py39-none-linux_aarch64.whl | Existing stable-ABI xformers wheel |
| xformers-0.0.35+cu130.torch2.11-cp312-cp312-linux_aarch64.whl | Repacked xformers wheel for Python 3.12 |
| xformers-0.0.35+cu130.torch2.11-cp313-cp313-linux_aarch64.whl | Repacked xformers wheel for Python 3.13 |
### Pure Python / Any platform

| File | Notes |
|---|---|
| angelslim-0.0.0.dev0+cu128.torch2.10-py3-none-any.whl | Pure Python wheel |
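Before picking a file from the tables above, it helps to print your environment's tags and match them against a filename. A minimal sketch, assuming torch is already installed:

```python
import sys

import torch

# Compare these values against the wheel filename tags before downloading.
print(f"Python tag: cp{sys.version_info.major}{sys.version_info.minor}")  # e.g. cp312
print(f"Torch:      {torch.__version__}")    # e.g. 2.11.0+cu130
print(f"CUDA:       {torch.version.cuda}")   # e.g. 13.0, or None on CPU-only builds
print(f"Platform:   {sys.platform}")         # 'win32' or 'linux'
```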
## Install examples

### Windows, Python 3.12 / Torch 2.11 / CUDA 13.0

```powershell
pip install --index-url https://download.pytorch.org/whl/cu130 torch==2.11.0+cu130
$base = "https://huggingface.co/datasets/tori29umai/PrecompiledWheels/resolve/main"
pip install "$base/sageattention-2.2.0+cu130.torch2.11-cp312-cp312-win_amd64.whl"
pip install "$base/xformers-0.0.35+cu130.torch2.11-cp312-cp312-win_amd64.whl"
```
### Windows, Python 3.13 / Torch 2.11 / CUDA 13.0

```powershell
pip install --index-url https://download.pytorch.org/whl/cu130 torch==2.11.0+cu130
$base = "https://huggingface.co/datasets/tori29umai/PrecompiledWheels/resolve/main"
pip install "$base/sageattention-2.2.0+cu130.torch2.11-cp313-cp313-win_amd64.whl"
pip install "$base/xformers-0.0.35+cu130.torch2.11-cp313-cp313-win_amd64.whl"
```
### Linux aarch64, Python 3.12 / Torch 2.11 / CUDA 13.0

```bash
pip install --index-url https://download.pytorch.org/whl/cu130 torch==2.11.0+cu130
BASE=https://huggingface.co/datasets/tori29umai/PrecompiledWheels/resolve/main
pip install "$BASE/flash_attn-2.8.4+cu130.torch2.11-cp312-cp312-linux_aarch64.whl"
pip install "$BASE/sageattention-2.2.0+cu130.torch2.11-cp312-cp312-linux_aarch64.whl"
pip install "$BASE/xformers-0.0.35+cu130.torch2.11-cp312-cp312-linux_aarch64.whl"
```
### Linux aarch64, Python 3.13 / Torch 2.11 / CUDA 13.0

```bash
pip install --index-url https://download.pytorch.org/whl/cu130 torch==2.11.0+cu130
BASE=https://huggingface.co/datasets/tori29umai/PrecompiledWheels/resolve/main
pip install "$BASE/flash_attn-2.8.4+cu130.torch2.11-cp313-cp313-linux_aarch64.whl"
pip install "$BASE/sageattention-2.2.0+cu130.torch2.11-cp313-cp313-linux_aarch64.whl"
pip install "$BASE/xformers-0.0.35+cu130.torch2.11-cp313-cp313-linux_aarch64.whl"
```
## Notes
- These wheels are redistributed for convenience.
- Please check each upstream project's license before reuse.
## Upstream projects

| Package | Repository |
|---|---|
| flash_attn | https://github.com/Dao-AILab/flash-attention |
| sageattention | https://github.com/thu-ml/SageAttention |
| xformers | https://github.com/facebookresearch/xformers |
| deep_gemm | https://github.com/deepseek-ai/DeepGEMM |
| angelslim | https://github.com/Tencent/AngelSlim |