```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("staeiou/bartleby-dlo-qwen3.5-2b-base-cpt")
model = AutoModelForCausalLM.from_pretrained("staeiou/bartleby-dlo-qwen3.5-2b-base-cpt")
```
# BartlebyGPT Dead Letter Office (DLO-Base)

The BartlebyGPT Dead Letter Office (DLO-Base) is a continued pretraining (CPT) of Qwen/Qwen3.5-2B-Base. CPT was run on ~62M tokens of Melvillian prose for 1 epoch with TRL.

It is not trained for instruction following or conversation; it is intended as a base for further fine-tuning.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="staeiou/bartleby-dlo-qwen3.5-2b-base-cpt")
```
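Once the pipeline is loaded, generation works like any `transformers` text-generation pipeline. Because this is a base model, it continues text rather than answering questions, so prompt it with an opening passage to complete. The prompt and generation settings below are illustrative:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="staeiou/bartleby-dlo-qwen3.5-2b-base-cpt")

# Prompt is illustrative; a base model completes text rather than chats
out = pipe("I would prefer not to", max_new_tokens=40, do_sample=True)
print(out[0]["generated_text"])
```

The pipeline returns a list of dicts, each with a `generated_text` key containing the prompt plus the completion.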