LLM Unlearning Without an Expert Curated Dataset
This is the best-performing Meta-Llama-3-8B-Instruct checkpoint unlearned with RMU on the Textbook-Cyber forget set. For more details, please see our paper.
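Since the card references RMU (Representation Misdirection Unlearning), here is a minimal, illustrative sketch of the RMU objective on toy activation arrays. This is not the authors' implementation: the hidden size, control-vector scale `c`, and retain weight `alpha` below are placeholder assumptions.

```python
import numpy as np

def rmu_loss(forget_acts, retain_acts, frozen_retain_acts, control_vec, alpha=100.0):
    # Forget term: steer the updated model's activations on forget-set tokens
    # toward a fixed, scaled random "control" direction.
    forget_term = np.mean((forget_acts - control_vec) ** 2)
    # Retain term: keep activations on retain-set tokens close to the frozen
    # (original) model's activations, weighted by alpha.
    retain_term = np.mean((retain_acts - frozen_retain_acts) ** 2)
    return forget_term + alpha * retain_term

rng = np.random.default_rng(0)
d = 16                                       # toy hidden size (placeholder)
c = 6.5                                      # placeholder control-vector scale
control_vec = c * rng.standard_normal(d)     # fixed random control direction
forget_acts = rng.standard_normal((4, d))    # toy forget-set activations
frozen_retain = rng.standard_normal((4, d))  # frozen model, retain set
# With identical retain activations, only the forget term contributes.
loss = rmu_loss(forget_acts, frozen_retain, frozen_retain, control_vec)
```

In practice this loss is minimized with gradient descent on a chosen intermediate layer of the model, which drives forget-set representations toward noise while anchoring retain-set representations to the original model.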
| Model | WMDP-Cyber (↓) | tinyMMLU (↑) | GSM8k (↑) | TriviaQA (↑) |
|---|---|---|---|---|
| Llama-3-8B-Instruct | 46.80 | 59.21 | 75.28 | 51.09 |
| Llama-3-8B-Instruct_RMU_Textbook-Cyber | 26.47 | 51.43 | 72.70 | 50.64 |
If you find this useful in your research, please consider citing our paper:
```bibtex
@misc{zhu2025llmunlearningexpertcurated,
      title={LLM Unlearning Without an Expert Curated Dataset},
      author={Xiaoyuan Zhu and Muru Zhang and Ollie Liu and Robin Jia and Willie Neiswanger},
      year={2025},
      eprint={2508.06595},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.06595},
}
```