
Conversation

smsm8898

give small flexibility to pad transformer

@joecummings
Member

@SM-Jang Thanks for the addition! A couple of notes:

  1. Can you refer to our style guide and make sure the code passes our linting?
  2. Can you add a brief note on the motivation behind this addition?

@smsm8898
Author

smsm8898 commented Mar 21, 2023

  1. Can you refer to our style guide and make sure the code passes our linting?
    Okay, I checked the code with flake8,
    and I renamed the argument to begin: bool, default=False as suggested (see the sketch after this list).
    To verify it, I updated the unit tests and they all pass
    (test/torchtext_unittest/test_transforms.py).

  2. Can you add a brief note on the motivation behind this addition?
    When I work on time-series modeling, I sometimes need to pad at the beginning and sometimes at the end.
    The old PadTransform only pads on one side, so I had to fall back to torch.nn.functional.pad() (workaround shown further below).
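For reference, here is a minimal sketch of what the begin flag could look like; max_length and pad_value mirror the arguments of the existing PadTransform, begin is the new flag discussed in point 1, and the merged code may differ in details:

import torch
from torch import Tensor
from torch.nn import Module


class PadTransform(Module):
    """Pad the last dimension of a tensor out to max_length.

    begin=False keeps the current behavior (pad at the end);
    begin=True pads at the beginning instead.
    """

    def __init__(self, max_length: int, pad_value: int, begin: bool = False):
        super().__init__()
        self.max_length = max_length
        self.pad_value = pad_value
        self.begin = begin

    def forward(self, x: Tensor) -> Tensor:
        encoded_length = x.size(-1)
        if encoded_length < self.max_length:
            pad_amount = self.max_length - encoded_length
            # torch.nn.functional.pad takes (left, right) amounts for the last dim
            pad = (pad_amount, 0) if self.begin else (0, pad_amount)
            x = torch.nn.functional.pad(x, pad, value=float(self.pad_value))
        return x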

For context, my current workaround without the flag looks like this:

...
self.query_transformer = Sequential(
    # Truncate(10),
    VocabTransform(query_vocab),
    ToTensor(),
)
...
# Padding has to happen outside the transform pipeline, via the functional API:
x = self.query_transformer(x)
x = torch.nn.functional.pad(x, (0, pad_amount), value=self.pad_value)
...

That's why I added this option to PadTransform.
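With the new flag, padding could live inside the transform chain itself instead of a separate functional call. A rough usage sketch, assuming the begin-aware PadTransform from the snippet above and placeholder values for query_vocab, max_len, and pad_value:

from torch.nn import Sequential
from torchtext.transforms import VocabTransform, ToTensor

# query_vocab, max_len, and pad_value are placeholders for the real pipeline values;
# PadTransform here is the begin-aware sketch above, not the released transform.
query_transformer = Sequential(
    VocabTransform(query_vocab),
    ToTensor(),
    PadTransform(max_length=max_len, pad_value=pad_value, begin=True),
)
x = query_transformer(tokens)  # no separate torch.nn.functional.pad call needed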
