
Quantization by Richard Erkhov.

Github

Discord

Request more models

eleuther-pythia410m-hh-dpo - bnb 8bits

Original model description:

---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- Anthropic/hh-rlhf
---

Pythia-410m supervised fine-tuned on the Anthropic hh-rlhf dataset for 1 epoch (the sft-model), then trained with DPO (paper) on the same dataset for 1 epoch.
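
The weights in this repo are stored in bitsandbytes 8-bit form, so they can be loaded directly with transformers (with bitsandbytes and accelerate installed). The sketch below uses a hypothetical repo id and an hh-rlhf-style Human/Assistant prompt; adjust both to the actual repo and your use case.

```python
# Minimal loading sketch for the 8-bit bnb quantization.
# The repo id below is a placeholder, not the exact path of this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RichardErkhov/eleuther-pythia410m-hh-dpo-8bits"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    device_map="auto",  # place the 8-bit weights on the available GPU(s)
)

prompt = "\n\nHuman: How do I bake bread?\n\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```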

wandb log

Benchmark evaluations included in the repo were done using lm-evaluation-harness.
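
Evaluations like these can be reproduced with the lm-evaluation-harness Python API. The sketch below uses a hypothetical repo id and example tasks; it is not the exact configuration behind the numbers in this repo.

```python
# Sketch of a harness-style evaluation run (lm-eval >= 0.4).
# Repo id and task list are placeholders, not the settings used for the
# results stored in this repo.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=RichardErkhov/eleuther-pythia410m-hh-dpo-8bits",  # hypothetical repo id
    tasks=["arc_easy", "hellaswag", "lambada_openai"],  # example tasks only
    batch_size=8,
)
print(results["results"])  # per-task metrics (acc, acc_norm, perplexity, ...)
```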

See Pythia-410m for original model details (paper).

Safetensors · 0.4B params · tensor types: F32, F16, I8