---
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen3.5-2B-Base/blob/main/LICENSE
base_model:
- Qwen/Qwen3.5-2B-Base
---

# BartlebyGPT Dead Letter Office (DLO-Base)

The BartlebyGPT Dead Letter Office (DLO-Base) is a continued pretraining (CPT) of Qwen/Qwen3.5-2B-Base. CPT was run on ~62M tokens of Melvillian prose for 1 epoch with TRL.

It is not trained for instruction following or conversation; it is intended as a base for further fine-tuning.
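Because this is a base model, it is used for plain text completion rather than chat, with no chat template applied. A minimal sketch with the standard `transformers` API, assuming a placeholder repo id (substitute the actual one):

```python
# Minimal text-completion sketch for a base (non-instruct) model.
# NOTE: the repo id below is a placeholder; substitute the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-namespace/dlo-base"  # hypothetical repo id

def complete(prompt: str, max_new_tokens: int = 64) -> str:
    """Continue `prompt` with greedy decoding; no chat template is used
    because the model is not instruction-tuned."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(out[0], skip_special_tokens=True)

# Example (downloads the checkpoint on first run):
# print(complete("I would prefer not to"))
```

Prompting it like a chat assistant will not work well; give it a prose fragment to continue instead.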