---
license: mit
language:
- en
pipeline_tag: text-generation
arxiv:
- https://arxiv.org/abs/2508.06595
library_name: transformers
---
## Model Details
The best-performing Mistral-7B-Instruct-v0.3 checkpoint unlearned with RMU on the Filter-Bio forget set. For more details, please see [our paper](https://arxiv.org/abs/2508.06595).
### Model Sources
- Base model: [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3)
- Repository: [https://github.com/xyzhu123/Synthetic_Textbook](https://github.com/xyzhu123/Synthetic_Textbook)
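
## How to Use

A minimal loading sketch with `transformers`. The repository id below is a placeholder, not this model's confirmed Hub id; substitute the actual id of this repo:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/Mistral-7B-Instruct-v0.3_RMU_Filter-Bio"  # placeholder: replace with this repo's Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # requires `accelerate`
)

# Build a chat-formatted prompt and generate a short completion.
messages = [{"role": "user", "content": "Explain what machine unlearning is in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```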
## Performance
| Model | WMDP-Bio | tinyMMLU | GSM8k | TriviaQA |
|---|---|---|---|---|
| Mistral-7B-Instruct-v0.3 | 67.48 | 64.20 | 50.19 | 56.81 |
| Mistral-7B-Instruct-v0.3_RMU_Filter-Bio | 26.39 | 52.54 | 44.04 | 56.51 |

All scores are percentages. Lower WMDP-Bio is better, since it measures the biosecurity knowledge targeted for removal; tinyMMLU, GSM8k, and TriviaQA track retained general capability.
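
The exact evaluation setup is not specified here; a common way to score models on these four benchmarks is EleutherAI's `lm-evaluation-harness`. The sketch below is an assumption about that setup, and the task names and repo id are placeholders to verify against your installed harness version:

```python
import lm_eval

# Hypothetical reproduction sketch using lm-evaluation-harness (v0.4+).
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=your-org/Mistral-7B-Instruct-v0.3_RMU_Filter-Bio",  # placeholder repo id
    tasks=["wmdp_bio", "tinyMMLU", "gsm8k", "triviaqa"],  # assumed task names; check `lm_eval --tasks list`
)
# Print the per-task metric dictionaries.
print(results["results"])
```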
## Citation
If you find this useful in your research, please consider citing our paper:
```bibtex
@misc{zhu2025llmunlearningexpertcurated,
      title={LLM Unlearning Without an Expert Curated Dataset},
      author={Xiaoyuan Zhu and Muru Zhang and Ollie Liu and Robin Jia and Willie Neiswanger},
      year={2025},
      eprint={2508.06595},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.06595},
}
```