
What's the max token length for HHEMv2? Token indices error and process killed.

#23
by hustzjl - opened

Hi HHEM authors,
what's the max token length for HHEMv2?
I ran into a problem while following the model card at https://huggingface.co/vectara/hallucination_evaluation_model. Running HHEM evaluation on 1,830 pairs produced:
You are using a model of type HHEMv2Config to instantiate a model of type HHEMv2. This is not supported for all configurations of models and can yield errors.
Token indices sequence length is longer than the specified maximum sequence length for this model (87731 > 512). Running this sequence through the model will result in indexing errors
Killed

Vectara org

Hi @hustzjl , thanks for your question. As mentioned in the model card you linked, HHEMv2 has unlimited context length. The message you observed is a harmless warning; you can safely ignore it.
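If the warning is noisy, it can be silenced with Python's standard logging module. A minimal sketch, assuming the message is emitted by the transformers tokenizer logger (the logger name below is an assumption; adjust it if the warning originates from a different transformers submodule):

```python
import logging

# The "Token indices sequence length is longer than ..." message is a
# warning emitted through Python's standard logging machinery. Raising
# the emitting logger's threshold to ERROR hides it without affecting
# anything else. (Logger name "transformers.tokenization_utils_base"
# is an assumption based on where tokenizers typically log from.)
logging.getLogger("transformers.tokenization_utils_base").setLevel(logging.ERROR)
```

This only changes what is printed; the model still receives the full input either way.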

I'm guessing you manually killed the process after seeing this message. If you did not, and the code stopped executing on its own, I'd be happy to help you dig into the source of the issue. Otherwise, the code should still execute and work as intended.
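If the process is in fact being terminated by the operating system (for example, the out-of-memory killer prints "Killed"), scoring the pairs in smaller chunks keeps peak memory bounded. A minimal sketch; the `predict` callable below is a stand-in for whatever scoring function your code uses and is an assumption for illustration, not HHEM's confirmed API:

```python
def predict_in_batches(predict, pairs, batch_size=32):
    """Score (premise, hypothesis) pairs in fixed-size chunks.

    `predict` is any callable that maps a list of pairs to a list of
    scores. Feeding slices instead of all 1,830 pairs at once bounds
    peak memory, which helps rule out the OOM killer as the cause of
    a "Killed" message.
    """
    scores = []
    for start in range(0, len(pairs), batch_size):
        scores.extend(predict(pairs[start:start + batch_size]))
    return scores
```

If the run completes with small batches but dies with the full list, memory pressure, not token length, is the likely culprit.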

Let us know if you run into any other issues.
