
Add missing parameters for seq2seq model #8

Open

fuzzythecat wants to merge 1 commit into KBNLresearch:master from fuzzythecat:patch-1

Conversation

@fuzzythecat

initialize_model_seq2seq was missing the positional parameter pred_char, so running lstm_synced.py with seq2seq = True raises an error.

This PR adds pred_char, plus an optional parameter char_embedding_size that enables the embedding layer in the seq2seq model.
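A minimal sketch of what the patched signature might look like. Everything beyond the names stated in this PR (initialize_model_seq2seq, pred_char, char_embedding_size) is an assumption: the extra parameters, the function body, and the returned config are hypothetical stand-ins for the real Keras model construction in lstm_synced.py.

```python
def initialize_model_seq2seq(num_chars, seq_length, pred_char,
                             char_embedding_size=None):
    """Hypothetical sketch of the fixed signature (not the real code).

    pred_char was previously missing from the parameter list, so calls
    made with seq2seq=True failed with a TypeError. char_embedding_size
    is the new optional parameter: when set, the seq2seq model would add
    an embedding layer of that size.
    """
    # Placeholder for the actual Keras model construction.
    config = {
        'num_chars': num_chars,
        'seq_length': seq_length,
        'pred_char': pred_char,
    }
    if char_embedding_size is not None:
        # Only enable the embedding layer when a size is given.
        config['char_embedding_size'] = char_embedding_size
    return config
```

Before the patch, a caller passing the seq2seq arguments would hit a TypeError for the missing positional parameter; with the signature above, pred_char is accepted and char_embedding_size stays optional.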

