---
library_name: peft
license: apache-2.0
base_model: HuggingFaceTB/SmolLM2-135M-Instruct
tags:
- generated_from_trainer
model-index:
- name: SmolLM2-135M-Instruct-Arabic-test
  results: []
---

# SmolLM2-135M-Instruct-Arabic-test

This model is a fine-tuned version of [HuggingFaceTB/SmolLM2-135M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2311

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: paged AdamW (`OptimizerNames.PAGED_ADAMW`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3

An illustrative configuration sketch restating these values is given after the framework versions below.

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.6658        | 0.0889 | 100  | 1.6589          |
| 1.6393        | 0.1778 | 200  | 1.5819          |
| 1.5377        | 0.2667 | 300  | 1.5419          |
| 1.5329        | 0.3556 | 400  | 1.5107          |
| 1.4348        | 0.4444 | 500  | 1.4799          |
| 1.4884        | 0.5333 | 600  | 1.4576          |
| 1.3968        | 0.6222 | 700  | 1.4366          |
| 1.432         | 0.7111 | 800  | 1.4194          |
| 1.4154        | 0.8    | 900  | 1.4033          |
| 1.3925        | 0.8889 | 1000 | 1.3879          |
| 1.4315        | 0.9778 | 1100 | 1.3771          |
| 1.3474        | 1.0667 | 1200 | 1.3649          |
| 1.3419        | 1.1556 | 1300 | 1.3524          |
| 1.358         | 1.2444 | 1400 | 1.3450          |
| 1.3034        | 1.3333 | 1500 | 1.3356          |
| 1.2675        | 1.4222 | 1600 | 1.3258          |
| 1.3282        | 1.5111 | 1700 | 1.3186          |
| 1.2946        | 1.6    | 1800 | 1.3098          |
| 1.2936        | 1.6889 | 1900 | 1.3027          |
| 1.2275        | 1.7778 | 2000 | 1.2956          |
| 1.2438        | 1.8667 | 2100 | 1.2915          |
| 1.2819        | 1.9556 | 2200 | 1.2826          |
| 1.2551        | 2.0444 | 2300 | 1.2767          |
| 1.2067        | 2.1333 | 2400 | 1.2799          |
| 1.2313        | 2.2222 | 2500 | 1.2687          |
| 1.2186        | 2.3111 | 2600 | 1.2633          |
| 1.2393        | 2.4    | 2700 | 1.2587          |
| 1.1746        | 2.4889 | 2800 | 1.2507          |
| 1.2061        | 2.5778 | 2900 | 1.2494          |
| 1.2453        | 2.6667 | 3000 | 1.2434          |
| 1.1822        | 2.7556 | 3100 | 1.2414          |
| 1.1563        | 2.8444 | 3200 | 1.2335          |
| 1.173         | 2.9333 | 3300 | 1.2311          |

### Framework versions

- PEFT 0.14.0
- Transformers 4.51.1
- PyTorch 2.5.1+cu124
- Datasets 3.5.0
- Tokenizers 0.21.0
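
For convenience, the hyperparameters above can be restated as a `transformers.TrainingArguments` configuration. This is only an illustrative sketch: the trainer class, dataset, and LoRA/PEFT adapter settings are not documented in this card, the output directory name is a placeholder, and `paged_adamw_32bit` (the string value behind `OptimizerNames.PAGED_ADAMW`) additionally requires `bitsandbytes`.

```python
from transformers import TrainingArguments

# Illustrative sketch restating the hyperparameters reported in this card.
# Trainer class, dataset, and LoRA/PEFT settings are undocumented here and
# therefore omitted; the output_dir name is a placeholder.
training_args = TrainingArguments(
    output_dir="SmolLM2-135M-Instruct-Arabic-test",
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="paged_adamw_32bit",      # paged AdamW; requires bitsandbytes
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",   # warmup_ratio is listed in the card, but a
    warmup_ratio=0.03,              # purely constant schedule does not apply it
    num_train_epochs=3,
    eval_strategy="steps",          # validation loss was logged every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```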
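
Because this repository contains a PEFT adapter rather than full model weights, a minimal loading sketch is shown below. The adapter id is a placeholder (this card does not state the Hub namespace), and the example prompt only assumes the Arabic focus suggested by the model name.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "HuggingFaceTB/SmolLM2-135M-Instruct"
adapter_id = "your-namespace/SmolLM2-135M-Instruct-Arabic-test"  # placeholder Hub id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the fine-tuned adapter

# Format a chat prompt with the base model's chat template and generate.
messages = [{"role": "user", "content": "مرحبا، كيف حالك؟"}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
output_ids = model.generate(input_ids, max_new_tokens=100)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```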