Adding `transformers` as the library tag for better visibility.

#1 opened by ariG23498 (HF Staff)

Here I have added `transformers` as the library tag and `text-generation` as the pipeline tag.
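
These tags correspond to the `library_name` and `pipeline_tag` fields in the model card metadata. Once this PR is merged, they can be checked programmatically; a minimal sketch using `huggingface_hub` (not part of this PR, just for verification):

from huggingface_hub import ModelCard

# Load the model card for the repo and read its metadata block.
card = ModelCard.load("Alibaba-NLP/Tongyi-DeepResearch-30B-A3B")
print(card.data.library_name)  # expected: "transformers"
print(card.data.pipeline_tag)  # expected: "text-generation"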

This would help make the model more visible and would also populate the "Use this model" code snippet on the model page. I have tested the model with the following code:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Alibaba-NLP/Tongyi-DeepResearch-30B-A3B"

# Load the checkpoint in its native dtype and place it on the first GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, dtype="auto", device_map="cuda:0")
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [
    {"role": "user", "content": "What do you know about Tensor Parallelism?"},
]

# Render the chat template, tokenize, and move the tensors to the model's device.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

# Generate up to 1024 new tokens without tracking gradients.
with torch.inference_mode():
    generated = model.generate(**inputs, max_new_tokens=1024)

# Decode the full sequence (prompt + completion).
print(tokenizer.batch_decode(generated)[0])
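
With the `text-generation` pipeline tag in place, the populated "Use this model" snippet typically takes the high-level `pipeline` form. A sketch of the equivalent call, with parameters mirroring the snippet above rather than taken from the generated snippet:

from transformers import pipeline

# The pipeline applies the chat template, generates, and decodes in one call.
pipe = pipeline(
    "text-generation",
    model="Alibaba-NLP/Tongyi-DeepResearch-30B-A3B",
    torch_dtype="auto",
    device_map="cuda:0",
)

messages = [
    {"role": "user", "content": "What do you know about Tensor Parallelism?"},
]

out = pipe(messages, max_new_tokens=1024)
# For chat inputs, generated_text holds the full message list;
# the last entry is the assistant's reply.
print(out[0]["generated_text"][-1]["content"])
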
NLPblue changed pull request status to merged
