Milo Bitcoin GPT-OSS-20B LoRA v1
A professional Bitcoin quantitative analysis model fine-tuned on GPT-OSS-20B
This is a LoRA (Low-Rank Adaptation) adapter for unsloth/gpt-oss-20b-unsloth-bnb-4bit, specifically fine-tuned for professional Bitcoin market analysis and trading signal generation.
Model Description
Milo Bitcoin is an AI-powered quantitative analyst that provides:
- Professional Trading Analysis: Multi-factor technical analysis with precise signals
- Structured Decision Output: JSON-formatted BUY/SELL/HOLD recommendations
- Quantitative Intelligence: Technical indicators, trend analysis, momentum signals
- Risk Assessment: Stop-loss, take-profit, and confidence scores
Key Features
- 🎯 Specialized in Bitcoin market analysis
- 📊 Structured JSON output format
- 🔢 Multi-task: price forecasting + classification + risk assessment
- 📈 Trading-ready signals for professional workflows
- ⚡ Consistent methodology, trained on 7,341 samples
Training Details
Training Data
- Training Samples: 6,239 professional Bitcoin analysis examples
- Validation Samples: 734 samples
- Test Samples: 368 samples
- Total Samples: 7,341 samples
- Data Quality: 99%+ of samples passed validation
- Data Mix: 90% Bitcoin analysis + 7% math reasoning + 3% logic reasoning (sample layout sketched below)
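The dataset schema is not published in this card, so the layout below is an illustrative assumption of what one instruction/response pair in the mix might look like:

```python
# Hypothetical sample layout (field names are assumptions, not the released schema)
sample = {
    "instruction": "Analyze the current Bitcoin market conditions with the following data: ...",
    "response": '{"action": "HOLD", "confidence": 72, ...}',  # JSON-formatted target, per the output spec below
    "source": "bitcoin_analysis",  # vs. "math_reasoning" / "logic_reasoning" for the 10% auxiliary mix
}
```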
Training Configuration
- Base Model: GPT-OSS-20B, a mixture-of-experts model (21B total parameters, 3.6B active per token)
- Method: LoRA (Low-Rank Adaptation)
- LoRA Rank: 64
- LoRA Alpha: 128
- Target Modules: q_proj, k_proj, v_proj, o_proj
- Training Epochs: 3
- Batch Size: 4 per device (effective batch size 32 via 8 gradient-accumulation steps)
- Learning Rate: 2e-4
- Training Time: 1.65 hours on an RTX 5090 (32GB VRAM); the sketch below shows how this configuration maps onto PEFT/TRL
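The listed values map one-to-one onto a PEFT `LoraConfig` and a TRL `SFTConfig`. A minimal sketch of that mapping, where anything not stated in this card (dropout, output path) is an assumption:

```python
from peft import LoraConfig
from trl import SFTConfig

# LoRA settings as listed above
lora_config = LoraConfig(
    r=64,                                                     # LoRA rank
    lora_alpha=128,                                           # scaling factor (alpha / r = 2)
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections only
    lora_dropout=0.0,                                         # assumption: not stated in the card
    task_type="CAUSAL_LM",
)

# Trainer settings derived from the listed values
training_args = SFTConfig(
    output_dir="milo-bitcoin-lora",  # assumption: any local path works
    num_train_epochs=3,
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,   # 4 x 8 = effective batch size of 32
    learning_rate=2e-4,
)
```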
Training Results
- Final Training Loss: 1.2539
- Final Validation Loss: 1.2931
- Training Speed: 3.145 samples/second
- Model Size: 122MB of LoRA weights vs. ~20GB for the full base model (a 99.4% size reduction)
- Convergence: Stable loss reduction, with validation loss (1.2931) staying close to training loss (1.2539) and no sign of overfitting
Usage
Installation
```bash
# bitsandbytes is required for the 4-bit base model; accelerate enables device_map="auto"
pip install transformers peft torch unsloth bitsandbytes accelerate
```
Load Model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the 4-bit quantized base model
base_model = AutoModelForCausalLM.from_pretrained(
    "unsloth/gpt-oss-20b-unsloth-bnb-4bit",
    device_map="auto",
    trust_remote_code=True,
)

# Attach the LoRA adapter
model = PeftModel.from_pretrained(
    base_model,
    "HugMilo/milo-bitcoin-gpt-oss-20b-lora-v1",
)

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(
    "HugMilo/milo-bitcoin-gpt-oss-20b-lora-v1"
)
```
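Since the adapter was trained with Unsloth, it should also be loadable through Unsloth's own loader, which applies its inference optimizations. A sketch rather than a verified recipe (Unsloth is expected to resolve the base model from the adapter's config):

```python
from unsloth import FastLanguageModel

# Unsloth detects the PEFT adapter in the repo and loads base model + adapter together
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="HugMilo/milo-bitcoin-gpt-oss-20b-lora-v1",
    max_seq_length=2048,  # assumption: choose a context length that fits your prompts
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference path
```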
Generate Analysis
```python
# Prepare the prompt
prompt = """Analyze the current Bitcoin market conditions with the following data:
- Current Price: $109,453
- 24h Change: -5.35%
- RSI(14): 31.2
- Volume: $22.63B
Provide professional trading analysis with structured output."""

# Generate a response
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    temperature=0.7,
    do_sample=True,
)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
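GPT-OSS is a chat model, so if raw-completion prompting gives inconsistent results, routing the same request through the tokenizer's chat template is worth trying (the card does not state which prompt format was used during fine-tuning, so treat this as an alternative, not the canonical path):

```python
# Alternative: wrap the request in the model's chat template
messages = [{"role": "user", "content": prompt}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the assistant-turn header
    return_tensors="pt",
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=512, temperature=0.7, do_sample=True)
# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```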
Expected Output Format
```json
{
  "action": "HOLD",
  "confidence": 72,
  "current_price": 109453.00,
  "stop_loss": 105200.00,
  "take_profit": 116800.00,
  "forecast_10d": [109800, 111200, 112500, 114100, 115600, 116200, 115800, 116800, 118200, 117900],
  "analysis": "BTC consolidating around $109k level after -5.35% 24h decline. RSI oversold at 31, testing key support. Market cap dominance 56.5% suggests institutional confidence remains.",
  "risk_score": 0.31,
  "technical_indicators": {
    "rsi_14": 31.2,
    "sma_20": 112500,
    "volume_24h": "22.63B USD"
  }
}
```
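Because generation is sampled, a malformed or out-of-range response is always possible, so downstream systems should validate the output before acting on it. A minimal parsing guard (the `parse_signal` helper is my own, not part of the model):

```python
import json

def parse_signal(response: str) -> dict:
    """Extract and sanity-check the JSON analysis block from a raw model response."""
    start, end = response.find("{"), response.rfind("}") + 1
    if start == -1 or end == 0:
        raise ValueError("no JSON object found in model output")
    signal = json.loads(response[start:end])

    # Basic sanity checks before any downstream use
    assert signal["action"] in {"BUY", "SELL", "HOLD"}
    assert 0 <= signal["confidence"] <= 100
    assert len(signal.get("forecast_10d", [])) == 10
    return signal

signal = parse_signal(response)
print(signal["action"], signal["confidence"])
```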
Intended Use
Primary Users
- Quantitative traders seeking AI-powered analysis signals
- Crypto fund managers requiring structured analysis frameworks
- Professional investors for data-driven portfolio management
- FinTech developers building Bitcoin analysis APIs
Use Cases
- Systematic trading signal generation
- Risk management and position sizing
- Research and backtesting
- API integration for trading systems (see the wrapper sketch below)
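For the API-integration case, a thin wrapper around the generation and parsing steps is usually enough. The function below is an illustrative sketch: its name, signature, and prompt template are my own, and it reuses the `parse_signal` helper defined earlier:

```python
def analyze_market(price: float, change_24h: float, rsi_14: float, volume: str) -> dict:
    """Build a prompt from live market data, run the model, and return the parsed signal."""
    prompt = (
        "Analyze the current Bitcoin market conditions with the following data:\n"
        f"- Current Price: ${price:,.0f}\n"
        f"- 24h Change: {change_24h:+.2f}%\n"
        f"- RSI(14): {rsi_14}\n"
        f"- Volume: {volume}\n"
        "Provide professional trading analysis with structured output."
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=512, temperature=0.7, do_sample=True)
    text = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return parse_signal(text)  # validation helper from the output-format section

signal = analyze_market(price=109_453, change_24h=-5.35, rsi_14=31.2, volume="$22.63B")
```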
Limitations and Disclaimers
⚠️ For Professional Traders Only
- Model predictions are based on historical data patterns (training data from Bitcoin market history)
- Past performance does not guarantee future results
- This is a tool for professional analysis, not financial advice
- Users are responsible for their own trading decisions and risk management
- Always combine with your own analysis and risk management framework
- Regulatory compliance is the user's responsibility
Performance Metrics
- Training Efficiency: roughly 7-9x faster than projected (1.65h actual vs. 12-15h estimated)
- JSON Format Consistency: 100% structured output during training
- Inference Speed: <2 seconds per analysis on RTX 5090
- Memory Requirements: ~8-12GB VRAM for inference (4-bit quantization)
Technical Specifications
- Framework: Transformers 4.56.2, PEFT 0.17.1, TRL 0.23.0
- PyTorch: 2.8.0+cu128
- Hardware Used: NVIDIA RTX 5090 (32GB VRAM)
- Quantization: 4-bit via bitsandbytes
- Gradient Checkpointing: Unsloth optimized
Model Card Authors
Norton Gu | University of Rochester '25
- GitHub: @futurespyhi
- LinkedIn: Norton Gu
- Project: Milo_Bitcoin
Citation
If you use this model in your research or applications, please cite:
```bibtex
@misc{gu2025milobitcoin,
  author       = {Norton Gu},
  title        = {Milo Bitcoin: AI-Powered Bitcoin Quantitative Analysis Assistant},
  year         = {2025},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/HugMilo/milo-bitcoin-gpt-oss-20b-lora-v1}},
}
```
License
MIT License - Free for educational and commercial use
Acknowledgments
- Base model: unsloth/gpt-oss-20b-unsloth-bnb-4bit
- Training framework: Unsloth
- Fine-tuning: TRL and PEFT
Building the future of AI-powered crypto analysis, one meow at a time 🐾₿