vLLM?
#25 opened 26 days ago by ansuglutor
New, bigger model?
#24 opened about 1 month ago by digitalesingulary
Can you provide ONNX model weights?
#23 opened about 2 months ago by ningpp
Poor results, what am I doing wrong?
1 reply · #22 opened 2 months ago by hanshupe
Limit for max_new_tokens
#21 opened 2 months ago by hanshupe
How can I install transformers>=4.56? Is that a stable version?
2 replies · #20 opened 2 months ago by duynvh2k
Replace **dtype** with **torch_dtype** in model loading
1 reply · #19 opened 2 months ago by DaVinciCode
When will Kosmos-2.5 be integrated into Transformers?
7 replies · #16 opened 3 months ago by digitalesingulary
What is the maximum possible number of new tokens?
#15 opened 11 months ago by PF94
Update pipeline tag
#14 opened 12 months ago by nielsr
Guide to train or fine-tune Kosmos-2.5 for other languages
5 replies · #13 opened about 1 year ago by GardensOfBabylon29
ImportError: cannot import name 'Kosmos2_5ForConditionalGeneration' from 'transformers'
1 reply · #12 opened about 1 year ago by ksarao
Table extraction
1 reply · #11 opened about 1 year ago by Jkppp
Upload receipt_00008.png
#10 opened about 1 year ago by callsys
Batch inference bug: all outputs in the same batch have the same prediction, even for different images
#8 opened about 1 year ago by weihf
Error: 'Kosmos2_5ForConditionalGeneration' from 'transformers' when using the latest transformers version
12 replies · #7 opened over 1 year ago by Nidhin117
Kosmos-2.5 - Containerized & made available over an API
2 replies · #6 opened over 1 year ago by AbheekG
Batch prediction slower than single-image inference on 1x4090?
#5 opened over 1 year ago by 04RR
Add example code / notebook
1 reply · #4 opened over 1 year ago by EwoutH
Will you be releasing a transformers version?
2 replies · #3 opened over 1 year ago by jshenoy
No config.json
1 reply · #2 opened over 1 year ago by abalogh
Apply for community grant: Academic project
1 reply · #1 opened over 1 year ago by estyle