# Model card for LEMON
LEMON (Learning Embeddings from Morphology Of Nuclei) is an open-source foundation model for single-cell histology images, presented in the paper *LEMON: a foundation model for nuclear morphology in Computational Pathology*.
The model is a Vision Transformer (ViT-S/8) trained with self-supervised learning on 10 million histology cell images sampled from 10,000 TCGA slides.
LEMON can be used to extract robust features from single-cell histology images for various downstream applications, such as gene expression prediction or cell type classification.
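As an illustration of such a downstream application, the 384-dimensional embeddings can feed a lightweight classifier. The sketch below is hypothetical and uses random vectors as stand-ins for real LEMON features, with a simple nearest-centroid classifier in NumPy (not part of the LEMON repository):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for LEMON embeddings: 200 cells x 384 features, two cell types
features = rng.normal(size=(200, 384))
labels = rng.integers(0, 2, size=200)

# Split into train and test sets
train_x, train_y = features[:150], labels[:150]
test_x = features[150:]

# Fit one centroid per class on the training embeddings
centroids = np.stack([train_x[train_y == c].mean(axis=0) for c in (0, 1)])

# Predict by assigning each test embedding to the closest class centroid
dists = np.linalg.norm(test_x[:, None, :] - centroids[None, :, :], axis=-1)
preds = dists.argmin(axis=1)
assert preds.shape == (50,)
```

In practice one would replace the random arrays with features extracted by LEMON (see below) and the nearest-centroid rule with any standard classifier.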
## How to use it to extract features
The code below runs inference. LEMON expects images of size 40x40 extracted at 0.25 microns per pixel (40X magnification). Note that the code requires the `model.py` script provided in the repository.
```python
import torch
from pathlib import Path
from torchvision.transforms import ToPILImage

from model import prepare_transform, get_vit_feature_extractor

device = "cpu"
model_name = "vits8"
target_cell_size = 40
weight_path = Path("lemon.pth.tar")
stats_path = Path("mean_std.json")

# Model
transform = prepare_transform(stats_path, size=target_cell_size)
model = get_vit_feature_extractor(weight_path, model_name, img_size=target_cell_size)
model.eval()
model.to(device)

# Data: a random image as a stand-in for a real 40x40 cell crop
image = ToPILImage()(torch.rand(3, target_cell_size, target_cell_size))

# Inference
with torch.autocast(device_type=device, dtype=torch.float16):
    with torch.inference_mode():
        features = model(transform(image).unsqueeze(0).to(device))
assert features.shape == (1, 384)
```
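The snippet above assumes you already have 40x40 cell crops. A minimal, illustrative sketch of cutting such crops around nucleus centroids with NumPy follows; the slide region and centroid coordinates are placeholders, not part of the LEMON code:

```python
import numpy as np

CELL_SIZE = 40  # LEMON input size at 0.25 microns per pixel
HALF = CELL_SIZE // 2

# Illustrative stand-in for an RGB slide region of shape (H, W, 3)
region = np.zeros((1000, 1000, 3), dtype=np.uint8)
centroids = [(120, 340), (500, 501), (980, 20)]  # (row, col) nucleus centres

crops = []
for r, c in centroids:
    # Keep only nuclei far enough from the border for a full 40x40 crop
    if HALF <= r <= region.shape[0] - HALF and HALF <= c <= region.shape[1] - HALF:
        crops.append(region[r - HALF:r + HALF, c - HALF:c + HALF])

assert all(crop.shape == (CELL_SIZE, CELL_SIZE, 3) for crop in crops)
```

Each crop can then be converted to a PIL image and passed through `transform` and the model as shown above.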
## Citation
If you find our work useful in your research, please consider citing:
```bibtex
@misc{chadoutaud2026lemonfoundationmodelnuclear,
  title={LEMON: a foundation model for nuclear morphology in Computational Pathology},
  author={Loïc Chadoutaud and Alice Blondel and Hana Feki and Jacqueline Fontugne and Emmanuel Barillot and Thomas Walter},
  year={2026},
  eprint={2603.25802},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2603.25802},
}
```