Update README.md
README.md CHANGED

@@ -121,7 +121,7 @@ We evaluate the INT4 and FP8 quantized models using several datasets. The FP8 qu
 
 This code repository is licensed under [the MIT License](https://github.com/inclusionAI/Ring-V2/blob/master/LICENSE).
 
-
+## Citation
 ```shell
 @misc{lingteam2025attentionmattersefficienthybrid,
       title={Every Attention Matters: An Efficient Hybrid Architecture for Long-Context Reasoning},