# Nanochat
Nanochat is a small language model by Andrej Karpathy, converted here to the Hugging Face Transformers format.
## Model Details
- Architecture: GPT-style transformer with RoPE, QK normalization, ReLU² activations, and logit softcapping (sketched after this list)
- Parameters: ~393M
- Hidden Size: 1280
- Layers: 20
- Attention Heads: 10
- Vocabulary: 65536 tokens
- Context Length: 2048 tokens
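The two less common architectural choices are easy to express directly. Below is a minimal PyTorch sketch of the ReLU² activation and logit softcapping; the cap value of 15.0 is an illustrative placeholder, not this model's actual configuration.

```python
import torch

def relu_squared(x: torch.Tensor) -> torch.Tensor:
    # ReLU²: the square of ReLU, used as the MLP activation.
    return torch.relu(x).square()

def softcap(logits: torch.Tensor, cap: float = 15.0) -> torch.Tensor:
    # Logit softcapping: smoothly bounds logits to (-cap, cap) via tanh,
    # keeping extreme values from dominating the softmax.
    # cap=15.0 is illustrative, not the model's configured value.
    return cap * torch.tanh(logits / cap)
```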
## Usage
### With Transformers (PyTorch)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code is required because the model ships custom modeling code.
model = AutoModelForCausalLM.from_pretrained("<model-path>", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("<model-path>")

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0]))
```
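By default `generate` decodes greedily; for more varied output, sampling can be enabled. Continuing from the snippet above, with illustrative (untuned) parameter values:

```python
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,   # sample instead of greedy decoding
    temperature=0.8,  # illustrative value; tune to taste
    top_k=50,         # illustrative value
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```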
### Converting to MLX
To use with Apple's MLX framework:
```bash
mlx_lm.convert --hf-path <model-path> --mlx-path nanochat-mlx --trust-remote-code
mlx_lm.generate --model nanochat-mlx --prompt "Once upon a time"
```
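The converted model can also be used from Python via the `mlx_lm` API; a minimal sketch, assuming the conversion above succeeded and produced the `nanochat-mlx` directory:

```python
from mlx_lm import load, generate

# Load the converted weights and tokenizer from the local MLX directory.
model, tokenizer = load("nanochat-mlx")

# Generate a short continuation; max_tokens here is an illustrative value.
text = generate(model, tokenizer, prompt="Once upon a time", max_tokens=50)
print(text)
```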
## Citation
Original model by Andrej Karpathy: https://github.com/karpathy/nanochat