---
language:
- en
tags:
- sentence-transformers
- cross-encoder
- reranker
- generated_from_trainer
- dataset_size:942069
- loss:PrecomputedDistillationLoss
base_model: jhu-clsp/ettin-encoder-68m
datasets:
- dleemiller/all-nli-distill
pipeline_tag: text-classification
library_name: sentence-transformers
metrics:
- f1_macro
- f1_micro
- f1_weighted
model-index:
- name: CrossEncoder based on jhu-clsp/ettin-encoder-68m
  results:
  - task:
      type: cross-encoder-classification
      name: Cross Encoder Classification
    dataset:
      name: AllNLI dev
      type: AllNLI-dev
    metrics:
    - type: f1_macro
      value: 0.8974451466908199
      name: F1 Macro
    - type: f1_micro
      value: 0.8976446049753268
      name: F1 Micro
    - type: f1_weighted
      value: 0.8979023442463457
      name: F1 Weighted
  - task:
      type: cross-encoder-classification
      name: Cross Encoder Classification
    dataset:
      name: AllNLI test
      type: AllNLI-test
    metrics:
    - type: f1_macro
      value: 0.8959668105702971
      name: F1 Macro
    - type: f1_micro
      value: 0.8961640211640212
      name: F1 Micro
    - type: f1_weighted
      value: 0.8964174910602712
      name: F1 Weighted
license: mit
---

# EttinX Cross-Encoder: Natural Language Inference (NLI)

This cross encoder performs sequence classification with contradiction/neutral/entailment labels. It is
drop-in compatible with comparable Sentence Transformers cross encoders.

To train this model, I added teacher logits from the `dleemiller/ModernCE-large-nli` model to the AllNLI
dataset (`dleemiller/all-nli-distill`). Distilling from these logits significantly improves performance over standard training.
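
As a rough sketch of what training against precomputed teacher logits can look like (the actual `PrecomputedDistillationLoss` implementation may differ; the blending weight `alpha` and temperature `T` below are illustrative assumptions, not values from the training run):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    # Hard-label cross-entropy on the gold contradiction/neutral/entailment labels
    ce = F.cross_entropy(student_logits, labels)
    # Soften both distributions with temperature T and match them with KL;
    # the T**2 factor keeps gradient magnitudes comparable across temperatures
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kl
```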

This 68M-parameter architecture is based on ModernBERT and is an excellent candidate for lightweight **CPU inference**.

---

## Features
- **High performing:** Achieves **87.98%** and **89.67%** (Micro F1) on the MNLI mismatched and SNLI test sets, respectively.
- **Efficient architecture:** Based on the Ettin-68m encoder design (68M parameters), offering faster inference speeds.
- **Extended context length:** Processes sequences up to 8192 tokens, great for LLM output evals (see the sketch below).
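
For instance, a minimal long-input sketch (the premise below is synthetic filler standing in for a lengthy LLM output; `max_length` is the standard `CrossEncoder` argument for raising the tokenizer's truncation cap):

```python
from sentence_transformers import CrossEncoder

# Raise the truncation cap toward the model's 8192-token context window
model = CrossEncoder("dleemiller/EttinX-nli-s", max_length=8192)

# Synthetic long premise standing in for a lengthy LLM output
long_output = " ".join(["The report states that quarterly revenue grew."] * 400)
scores = model.predict([(long_output, "Revenue increased during the quarter.")])
print(scores.argmax(axis=1))  # index into [contradiction, entailment, neutral]
```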

---

## Performance

| Model                     | MNLI Mismatched   | SNLI Test    | Context Length | # Parameters |
|---------------------------|-------------------|--------------|----------------|----------------|
| [dleemiller/ModernCE-large-nli](https://huggingface.co/dleemiller/ModernCE-large-nli)      | **0.9202**        | 0.9110       | 8192           | 395M  |
| [dleemiller/ModernCE-base-nli](https://huggingface.co/dleemiller/ModernCE-base-nli)       | 0.9034            | 0.9025       | 8192           | 149M  |
| [cross-encoder/nli-deberta-v3-large](https://huggingface.co/cross-encoder/nli-deberta-v3-large)    | 0.9049            | 0.9220       | 512            | 435M  | 
| [cross-encoder/nli-deberta-v3-base](https://huggingface.co/cross-encoder/nli-deberta-v3-base)      | 0.9004            | 0.9234       | 512            | 184M  |
| [dleemiller/EttinX-nli-s](https://huggingface.co/dleemiller/EttinX-nli-s)           | 0.8798            | 0.8967       | 8192           | 68M   |
| [cross-encoder/nli-distilroberta-base](https://huggingface.co/cross-encoder/nli-distilroberta-base) | 0.8398          | 0.8838       | 512            | 82M   |
| [dleemiller/EttinX-nli-xs](https://huggingface.co/dleemiller/EttinX-nli-xs)           | 0.8380            | 0.8820       | 8192           | 32M   |
| [dleemiller/EttinX-nli-xxs](https://huggingface.co/dleemiller/EttinX-nli-xxs)          | 0.8047            | 0.8695       | 8192           | 17M   |

## Extended NLI Evaluation Results

F1-Micro scores (equivalent to accuracy) for each dataset.
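
Micro-averaged F1 reduces to accuracy for single-label multiclass classification, which a quick scikit-learn check confirms (the toy labels below are arbitrary):

```python
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 1, 2, 1, 0]
y_pred = [0, 1, 1, 1, 0]
assert f1_score(y_true, y_pred, average="micro") == accuracy_score(y_true, y_pred)
```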

| Model | finecat | mnli | mnli_mismatched | snli | anli_r1 | anli_r2 | anli_r3 | wanli | lingnli |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| [dleemiller/finecat-nli-l](https://huggingface.co/dleemiller/finecat-nli-l) | **0.8152** | **0.9088** | <u>0.9217</u> | <u>0.9259</u>   | **0.7400** | **0.5230** | **0.5150** | **0.7424** | **0.8689** |
| [tasksource/ModernBERT-large-nli](https://huggingface.co/tasksource/ModernBERT-large-nli) | 0.7959     | 0.8983     | **0.9229** | 0.9188     | <u>0.7260</u>   | <u>0.5110</u>   | <u>0.4925</u> | <u>0.6978</u> | 0.8504 |
| [dleemiller/ModernCE-large-nli](https://huggingface.co/dleemiller/ModernCE-large-nli)   | 0.7811     | **0.9088** | 0.9205     | **0.9273** | 0.6630     | 0.4860     | 0.4408 | 0.6576 | <u>0.8566</u> |
| [tasksource/ModernBERT-base-nli](https://huggingface.co/tasksource/ModernBERT-base-nli)  | 0.7595 | 0.8685 | 0.8979 | 0.8915 | 0.6300 | 0.4820 | 0.4192 | 0.6632 | 0.8118 |
| [dleemiller/ModernCE-base-nli](https://huggingface.co/dleemiller/ModernCE-base-nli)    | 0.7533 | 0.8923 | 0.9035 | 0.9187 | 0.5240 | 0.3950 | 0.3333 | 0.6464 | 0.8282 |
| [dleemiller/EttinX-nli-s](https://huggingface.co/dleemiller/EttinX-nli-s)   | 0.7251 | 0.8765 | 0.8798 | 0.9128 | 0.3360 | 0.2790 | 0.3083 | 0.6234 | 0.8012 |
| [dleemiller/EttinX-nli-xs](https://huggingface.co/dleemiller/EttinX-nli-xs)  | 0.7013 | 0.8376 | 0.8380 | 0.8979 | 0.2780 | 0.2840 | 0.2800 | 0.5838 | 0.7521 |
| [dleemiller/EttinX-nli-xxs](https://huggingface.co/dleemiller/EttinX-nli-xxs) | 0.6842 | 0.7988 | 0.8047 | 0.8851 | 0.2590 | 0.3060 | 0.2992 | 0.5426 | 0.7018 |

---

## Usage

To use EttinX for NLI tasks, you can load the model with the `sentence-transformers` library:

```python
from sentence_transformers import CrossEncoder

# Load EttinX model
model = CrossEncoder("dleemiller/EttinX-nli-s")

scores = model.predict([
    ('A man is eating pizza', 'A man eats something'),
    ('A black race car starts up in front of a crowd of people.', 'A man is driving down a lonely road.')
])

# Convert scores to labels
label_mapping = ['contradiction', 'entailment', 'neutral']
labels = [label_mapping[score_max] for score_max in scores.argmax(axis=1)]
# ['entailment', 'contradiction']
```
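
If you prefer plain `transformers`, an equivalent sketch (assuming the checkpoint exposes standard sequence-classification weights and an `id2label` mapping, which Sentence Transformers cross encoders typically do):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "dleemiller/EttinX-nli-s"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Premise/hypothesis pairs are encoded as a single sequence pair
inputs = tokenizer(
    "A man is eating pizza", "A man eats something",
    return_tensors="pt", truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(dim=-1).item()])
```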

---

## Training Details

### Pretraining
We initialize the model from the `jhu-clsp/ettin-encoder-68m` weights.

Details:
- Batch size: 256
- Learning rate: 1e-4
- **Attention dropout:** 0.1

### Fine-Tuning
Fine-tuning was performed on the `dleemiller/all-nli-distill` dataset.

### Validation Results
The model achieved the following test-set micro F1 performance after fine-tuning:
- **MNLI Mismatched:** 0.8798
- **SNLI:** 0.8967

---

## Model Card

- **Architecture:** Ettin-encoder-68m
- **Fine-Tuning Data:** `dleemiller/all-nli-distill`

---

## Thank You

Thanks to the Johns Hopkins team for providing the Ettin encoder models, and the Sentence Transformers team for their leadership in transformer encoder models.

---

## Citation

If you use this model in your research, please cite:

```bibtex
@misc{ettinxnli2025,
  author = {Miller, D. Lee},
  title = {EttinX NLI: An NLI cross encoder model},
  year = {2025},
  publisher = {Hugging Face Hub},
  url = {https://huggingface.co/dleemiller/EttinX-nli-s},
}
```

---

## License

This model is licensed under the [MIT License](LICENSE).