One question about the decoder of vae #17

@Hsintien-Ng

Description

File: https://github.com/ChunyuanLI/Optimus/blob/master/code/examples/big_ae/modules/vae.py

Code at lines 133, 143, and 188: outputs = self.decoder(input_ids=labels, past=latent_z, labels=labels, label_ignore=self.pad_token_id)
This line passes labels as the input_ids of the decoder. I wonder if this is an error.
Should it be input_ids=inputs? Since labels contains -1 entries, it triggers an error at line 460 (inputs_embeds = self.wte(input_ids)) in modeling_gpt2.py in pytorch_transformers.
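To make the failure mode concrete, here is a minimal pure-Python sketch (no torch) of why an embedding lookup rejects -1 indices, plus one hypothetical workaround of remapping the ignore index to the pad id before the lookup. The names wte, PAD_TOKEN_ID, and IGNORE_INDEX are illustrative stand-ins, not the actual Optimus code:

```python
PAD_TOKEN_ID = 0     # assumed pad id, for illustration only
IGNORE_INDEX = -1    # positions masked out of the loss

# Toy embedding table: token id -> "vector"
vocab = ["<pad>", "hello", "world"]

def wte(input_ids):
    """Toy stand-in for GPT-2's token embedding lookup (self.wte)."""
    out = []
    for i in input_ids:
        if i < 0 or i >= len(vocab):
            # Mirrors the index-out-of-range error raised by the real
            # embedding layer when input_ids contains -1
            raise IndexError(f"index {i} out of range for embedding table")
        out.append(vocab[i])
    return out

labels = [1, 2, IGNORE_INDEX]  # -1 marks an ignored position

# Direct lookup fails, analogous to the error at modeling_gpt2.py line 460:
try:
    wte(labels)
except IndexError as e:
    print("lookup failed:", e)

# Hypothetical fix: map the ignore index back to the pad id before
# using the sequence as decoder input_ids
input_ids = [i if i != IGNORE_INDEX else PAD_TOKEN_ID for i in labels]
print(wte(input_ids))
```

The sketch only illustrates the symptom; whether the intended fix in Optimus is input_ids=inputs or a remapping like the above is exactly the question being asked.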

Thanks.
