Code to use "facebook/blenderbot-400M-distill" is not working from "use this model" section #1722

@anupamsaraswat

Description

Steps to reproduce the error

```shell
!pip install -U transformers
```

```
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
datasets 3.6.0 requires fsspec[http]<=2025.3.0,>=2023.1.0, but you have fsspec 2025.5.1 which is incompatible.
Successfully installed huggingface-hub-0.34.4 tokenizers-0.22.0 transformers-4.56.0
```

Then I ran the snippet from the "Use this model" section:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("facebook/blenderbot-400M-distill")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/blenderbot-400M-distill")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

Here is the error I faced:


```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_36/950536689.py in <cell line: 0>()
      7     {"role": "user", "content": "Who are you?"},
      8 ]
----> 9 inputs = tokenizer.apply_chat_template(
     10     messages,
     11     add_generation_prompt=True,

/usr/local/lib/python3.11/dist-packages/transformers/tokenization_utils_base.py in apply_chat_template(self, conversation, tools, documents, chat_template, add_generation_prompt, continue_final_message, tokenize, padding, truncation, max_length, return_tensors, return_dict, return_assistant_tokens_mask, tokenizer_kwargs, **kwargs)
   1618         tokenizer_kwargs = {}
   1619 
-> 1620         chat_template = self.get_chat_template(chat_template, tools)
   1621 
   1622         if isinstance(conversation, (list, tuple)) and (

/usr/local/lib/python3.11/dist-packages/transformers/tokenization_utils_base.py in get_chat_template(self, chat_template, tools)
   1796             chat_template = self.chat_template
   1797         else:
   1798             raise ValueError(

ValueError: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating
```
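For context on what is missing: `apply_chat_template` fails because this tokenizer ships with no `tokenizer.chat_template` (a Jinja template that flattens the message list into a single prompt string). A minimal pure-Python sketch of what such a template computes, assuming a simple `role: content` layout (`render_chat` is a hypothetical illustration, not a transformers API):

```python
# Hypothetical sketch of what a chat template produces: it flattens a
# list of {"role", "content"} messages into one prompt string, optionally
# appending a cue for the model to continue as the assistant.
def render_chat(messages, add_generation_prompt=True):
    lines = [f"{m['role']}: {m['content']}" for m in messages]
    if add_generation_prompt:
        lines.append("assistant:")  # generation cue for the model
    return "\n".join(lines)

messages = [{"role": "user", "content": "Who are you?"}]
print(render_chat(messages))
```

In transformers the equivalent logic lives in a Jinja string assigned to `tokenizer.chat_template` (or passed via the `chat_template=` argument of `apply_chat_template`); the point of this issue is that the "Use this model" snippet calls `apply_chat_template` for a model whose tokenizer provides no such template.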
