This repository was archived by the owner on Aug 18, 2021. It is now read-only.

seq2seq-translation-batched: Bahdanau attention does not work #82

@juditacs

Description


__init__.py fails with an AttributeError: max_length does not exist. Fixing that exposes a concat error in the Attn class:

     45         elif self.method == 'concat':
---> 46             energy = self.attn(torch.cat((hidden, encoder_output), 1))
     47             energy = self.v.dot(energy)
     48             return energy

RuntimeError: dimension out of range (expected to be in range of [-1, 0], but got 1)

Replacing the dimension in line 46 with 0 results in this error:

    44 
     45         elif self.method == 'concat':
---> 46             energy = self.attn(torch.cat((hidden, encoder_output), 0))
     47             energy = self.v.dot(energy)
     48             return energy

RuntimeError: inconsistent tensor sizes at /opt/conda/conda-bld/pytorch_1512386481460/work/torch/lib/THC/generic/THCTensorMath.cu:157
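For what it's worth, the 'concat' branch seems to assume that hidden and encoder_output are plain 1-D vectors of the same size, which is not what the decoder passes in. Below is a minimal sketch of how the concat (Bahdanau-style) score could be written so that the shapes line up; the class name ConcatAttn and the assumption that both inputs are 1-D tensors of size hidden_size are mine, not from the tutorial:

```python
import torch
import torch.nn as nn

class ConcatAttn(nn.Module):
    # Hypothetical minimal rewrite of the tutorial's 'concat' scoring branch.
    # Assumes hidden and encoder_output are both 1-D tensors of size hidden_size.
    def __init__(self, hidden_size):
        super().__init__()
        # The projection must accept the concatenation of both vectors,
        # hence an input size of hidden_size * 2.
        self.attn = nn.Linear(hidden_size * 2, hidden_size)
        self.v = nn.Parameter(torch.rand(hidden_size))

    def score(self, hidden, encoder_output):
        # Concatenate along the feature dimension (dim 0 for 1-D tensors),
        # project back to hidden_size, then reduce to a scalar energy with v.
        energy = torch.tanh(self.attn(torch.cat((hidden, encoder_output), dim=0)))
        return torch.dot(self.v, energy)
```

If hidden comes out of the GRU with extra batch/sequence dimensions (e.g. shape 1 x 1 x hidden_size), it would need to be squeezed first; otherwise torch.cat along dim 1 fails exactly as shown above.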
