Chat crashes  #13

@Chenjm08


Running the following command causes a crash. How can I fix it?

python3 main.py chat
Doragd > 你在 干嘛

Crash output:

Traceback (most recent call last):
  File "main.py", line 38, in <module>
    fire.Fire()
  File "/home/chenjm/.local/lib/python3.8/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/chenjm/.local/lib/python3.8/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/chenjm/.local/lib/python3.8/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "main.py", line 28, in chat
    output_words = train_eval.output_answer(input_sentence, searcher, sos, eos, unknown, opt, word2ix, ix2word)
  File "/home/chenjm/chat-ai/Chinese-Chatbot-PyTorch-Implementation/train_eval.py", line 291, in output_answer
    tokens = generate(input_seq, searcher, sos, eos, opt)
  File "/home/chenjm/chat-ai/Chinese-Chatbot-PyTorch-Implementation/train_eval.py", line 202, in generate
    tokens, scores = searcher(sos, eos, input_batch, input_lengths, opt.max_generate_length, opt.device)
  File "/home/chenjm/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/chenjm/chat-ai/Chinese-Chatbot-PyTorch-Implementation/utils/greedysearch.py", line 17, in forward
    encoder_outputs, encoder_hidden = self.encoder(input_seq, input_length)
  File "/home/chenjm/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/chenjm/chat-ai/Chinese-Chatbot-PyTorch-Implementation/model.py", line 51, in forward
    packed = torch.nn.utils.rnn.pack_padded_sequence(embedded, input_lengths)
  File "/home/chenjm/.local/lib/python3.8/site-packages/torch/nn/utils/rnn.py", line 262, in pack_padded_sequence
    _VF._pack_padded_sequence(input, lengths, batch_first)
RuntimeError: 'lengths' argument should be a 1D CPU int64 tensor, but got 1D cuda:0 Long tensor
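
This looks like a known PyTorch behavior: since version 1.7, torch.nn.utils.rnn.pack_padded_sequence requires its lengths argument to be a 1D int64 tensor (or a Python list) on the CPU, and passing a CUDA tensor raises exactly this error. A minimal sketch of a possible fix at the call site shown in the traceback (model.py, line 51), assuming input_lengths is the cuda:0 tensor from the error message:

    # Sketch of a possible fix: pack_padded_sequence (PyTorch >= 1.7) expects
    # lengths on the CPU, so move input_lengths off the GPU before packing.
    # Variable names follow the traceback; this is an assumption, not the
    # repository's confirmed fix.
    packed = torch.nn.utils.rnn.pack_padded_sequence(embedded, input_lengths.cpu())

Keeping the rest of the model on the GPU is unaffected, since only the length metadata needs to live on the CPU.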
