A newer version of this model is available:
EpistemeAI/metatune-gpt20b-R1
This is metatune-gpt20b, a prototype model for a self-improving AI training loop (sketched below) that:
- Generates new data for itself,
- Evaluates its performance, and
- Adjusts its own hyperparameters based on improvement metrics.
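The loop below is a minimal, hypothetical sketch of that cycle; the helper functions, evaluation metric, and learning-rate rule are illustrative stand-ins, not the actual EpistemeAI training code.

import random

def generate_data(model_state):
    # Placeholder: the model would generate new training examples for itself here.
    return [f"example-{i}" for i in range(8)]

def finetune(model_state, data, lr):
    # Placeholder: one fine-tuning pass over the self-generated data.
    return {**model_state, "steps": model_state["steps"] + len(data)}

def evaluate(model_state):
    # Placeholder: a held-out benchmark score; a random proxy here.
    return random.random()

def self_improve(model_state, n_rounds=3, lr=1e-5):
    best_score = evaluate(model_state)
    for _ in range(n_rounds):
        data = generate_data(model_state)            # 1) generate new data for itself
        candidate = finetune(model_state, data, lr)  # 2) train on that data
        score = evaluate(candidate)                  # 3) evaluate performance
        if score > best_score:                       # 4) adjust hyperparameters from the metric
            model_state, best_score, lr = candidate, score, lr * 1.1
        else:
            lr *= 0.5                                # back off when the score regresses
    return model_state

self_improve({"steps": 0})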
Use cases:
- Demonstrating scientific and mathematical understanding at a postdoctoral level.
- Topics: the Euler–Lagrange equation, vector calculus, statistical mechanics (the first is stated for reference below).
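For reference, the result the example prompt below asks the model to derive: requiring the action S[q] to be stationary under variations of the path q(t) yields the Euler–Lagrange equation.

S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\,dt,
\qquad
\delta S = 0 \;\Longrightarrow\; \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0.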
Guardrails:
- Use the safety model openai/gpt-oss-safeguard-20b as a guardrail on prompts and outputs (a minimal sketch follows).
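A minimal sketch of one way to apply the safeguard model, assuming it is prompted with a moderation policy in the system message and the candidate output to review in the user message; the policy text and variable names are illustrative.

from transformers import pipeline

guard = pipeline(
    "text-generation",
    model="openai/gpt-oss-safeguard-20b",
    torch_dtype="auto",
    device_map="auto",
)

policy = "Flag content that gives instructions for weapons, malware, or self-harm."  # illustrative policy
candidate_output = "..."  # text produced by EpistemeAI/metatune-gpt20b

verdict = guard(
    [
        {"role": "system", "content": policy},
        {"role": "user", "content": candidate_output},
    ],
    max_new_tokens=256,
)
print(verdict[0]["generated_text"][-1])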
Inference examples
Transformers
You can use this model with Transformers. If you use the Transformers chat template, it will automatically apply the harmony response format. If you call model.generate directly, you need to apply the harmony format manually using the chat template or the openai-harmony package (see the sketch after the pipeline example below).
To get started, install the necessary dependencies to set up your environment:
pip install -U transformers kernels torch
For Google Colab (free/Pro)
!pip install -q --upgrade torch
!pip install -q transformers triton==3.4 kernels
!pip uninstall -q torchvision torchaudio -y
Once set up, you can run the model with the snippet below:
from transformers import pipeline
import torch

model_id = "EpistemeAI/metatune-gpt20b"

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype="auto",
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Derive the Euler–Lagrange equation from the principle of stationary action."},
]

outputs = pipe(
    messages,
    max_new_tokens=3000,
)
print(outputs[0]["generated_text"][-1])
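If you call model.generate directly instead of using the pipeline, the harmony format can be applied via the chat template; a minimal sketch (the prompt text is illustrative):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EpistemeAI/metatune-gpt20b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "user", "content": "State the divergence theorem for a vector field."},
]

# Apply the chat template (harmony format) manually before generating.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))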
Benchmarks: TBD
Thank you
- OpenAI
- Unsloth
- Google Colab
- Nvidia for A100
Uploaded finetuned model
- Developed by: EpistemeAI
- License: apache-2.0
- Finetuned from model: unsloth/gpt-oss-20b-unsloth-bnb-4bit
This gpt_oss model was trained 2x faster with Unsloth and Hugging Face's TRL library.
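A hypothetical sketch of that Unsloth + TRL setup; the dataset file, LoRA settings, and training arguments are illustrative assumptions, not the published training recipe.

from unsloth import FastLanguageModel
from trl import SFTConfig, SFTTrainer
from datasets import load_dataset

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gpt-oss-20b-unsloth-bnb-4bit",  # 4-bit base listed above
    max_seq_length=4096,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)  # attach LoRA adapters

dataset = load_dataset("json", data_files="self_generated_data.jsonl", split="train")  # assumed data file

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        max_steps=100,
        output_dir="outputs",
    ),
)
trainer.train()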
Model tree for EpistemeAI/metatune-gpt20b
- Base model: openai/gpt-oss-20b
- Quantized: unsloth/gpt-oss-20b-unsloth-bnb-4bit
