EMoE: Eigenbasis-Guided Routing for Mixture-of-Experts
This repository hosts pretrained checkpoints for EMoE and a Hub-compatible loading path.
Paper: https://arxiv.org/abs/2601.12137 or https://huggingface.co/papers/2601.12137
Code: https://github.com/Belis0811/EMoE
Files:

- `model.safetensors`: EMoE ViT-Base in standard Hub format (vit_base_patch16_224, ImageNet-1k)
- `eigen_moe_vit_base_patch16_224_imagenet1k.pth`
- `eigen_moe_vit_large_patch16_224.augreg_in21k_ft_in1k_imagenet1k.pth`
- `eigen_moe_vit_huge_patch14_224_in21k_imagenet1k.pth`

Install dependencies:

```bash
pip install -U torch timm huggingface_hub safetensors
```
Load the Hub-formatted checkpoint:
```python
import torch
from eigen_moe import HFEigenMoE

model = HFEigenMoE.from_pretrained(
    "anzheCheng/EMoE",
    vit_model_name="vit_base_patch16_224",
    num_classes=1000,
    strict=False,
)
model.eval()

x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)
print(logits.shape)
```
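The returned logits are unnormalized class scores; a standard softmax turns them into probabilities. A minimal post-processing sketch (the random logits stand in for a real model output, and class-label lookup is omitted):

```python
import torch

# Hypothetical logits for a batch of one image over 1000 ImageNet classes.
logits = torch.randn(1, 1000)

# Convert to probabilities and take the five most likely class indices.
probs = torch.softmax(logits, dim=-1)
top5_prob, top5_idx = probs.topk(5, dim=-1)
print(top5_idx.shape)
```

Map `top5_idx` through an ImageNet-1k label file to get human-readable class names.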
Load one of the original .pth files explicitly:
```python
model = HFEigenMoE.from_pretrained(
    "anzheCheng/EMoE",
    vit_model_name="vit_large_patch16_224.augreg_in21k_ft_in1k",
    num_classes=1000,
    checkpoint_filename="eigen_moe_vit_large_patch16_224.augreg_in21k_ft_in1k_imagenet1k.pth",
    strict=False,
)
```
Citation:

```bibtex
@article{cheng2026emoe,
  title={EMoE: Eigenbasis-Guided Routing for Mixture-of-Experts},
  author={Cheng, Anzhe and Duan, Shukai and Li, Shixuan and Yin, Chenzhong and Cheng, Mingxi and Nazarian, Shahin and Thompson, Paul and Bogdan, Paul},
  journal={arXiv preprint arXiv:2601.12137},
  year={2026}
}
```