Add Fourier Embedding layer #35
base: main
Conversation
layer_dim = [dim_inputs] + dim_hidden + [dim_outputs]
# multi-layer MLP
layers = [nn.Linear(layer_dim[i], layer_dim[i + 1]) for i in range(len(layer_dim) - 1)]
self.linear = nn.ModuleList(layers)
I need to find a way to make this part more comprehensible. Basically, if the Fourier embedding is enabled, its output dimension is 2m = hidden, where m is set by the first hidden-layer argument. The layers are then wired as:
Fourier layer: dim_inputs -> 2m
First linear: hidden -> hidden
See the sketch below.
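A minimal sketch of that wiring, assuming the `FourierEmbedding(dim_inputs, half_dim_output, sigma)` argument order seen in the diff below and an even first hidden width (both assumptions, not necessarily the final API):

```python
import torch.nn as nn

from pinn.pinn_1d import FourierEmbedding   # module added in this PR; argument order below is assumed

# Example values; in the script these come from the CLI arguments.
dim_inputs, dim_hidden, dim_outputs, sigma = 1, [64, 64, 64], 1, 5.0

m = dim_hidden[0] // 2                              # assumes dim_hidden[0] is even, so 2*m == hidden
fourier = FourierEmbedding(dim_inputs, m, sigma)    # Fourier layer: dim_inputs -> 2m
layer_dim = [dim_hidden[0]] + dim_hidden[1:] + [dim_outputs]
# the first Linear now maps hidden -> hidden, because the Fourier layer replaces the input layer
linear = nn.ModuleList(
    [nn.Linear(layer_dim[i], layer_dim[i + 1]) for i in range(len(layer_dim) - 1)]
)
```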
Maybe there is a better way to define this, e.g. `FourierLayer(input, output, sigma)`, but keeping the arguments simple is challenging. A rough sketch of that idea follows below.
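Purely illustrative sketch of that alternative signature (the names and the internal derivation of `m` are assumptions, not part of this PR):

```python
import torch
import torch.nn as nn

class FourierLayer(nn.Module):
    """Illustrative alternative: caller passes the full (even) output width, m is derived inside."""

    def __init__(self, dim_inputs: int, dim_outputs: int, sigma: float):
        super().__init__()
        assert dim_outputs % 2 == 0, "output width must be even: (cos, sin) pairs"
        m = dim_outputs // 2                                           # number of frequency pairs
        self.register_buffer("B", torch.randn(m, dim_inputs) * sigma)  # fixed Gaussian frequencies
```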
super().__init__()
self.sigma = sigma
m = half_dim_output  # number of frequency pairs (cos, sin)
B = torch.rand(m, dim_inputs) * sigma
Should be Gaussian (`torch.randn`), not uniform (`torch.rand`).
Pull Request Overview
This PR introduces a Fourier embedding layer to help the model capture high-frequency components by mapping inputs through sinusoidal features.
- Added a `--fourier_embedding_sigma` argument to enable/disable Fourier embeddings and set their scale.
- Implemented a `FourierEmbedding` module that samples fixed frequency transforms (see the sketch after this list).
- Integrated `FourierEmbedding` into `Level` and `MultiLevelNN` when `fourier_embedding_sigma` is provided.
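A minimal sketch of such a module, following the description above and the reference's form $\gamma(x) = [\cos(Bx), \sin(Bx)]$ with $B \sim N(0,\sigma^2)$; the constructor argument order and the absence of a $2\pi$ factor are assumptions, not necessarily what the PR implements:

```python
import torch
import torch.nn as nn

class FourierEmbedding(nn.Module):
    """Maps x in R^d to [cos(Bx), sin(Bx)] in R^{2m}, with B ~ N(0, sigma^2) fixed at init.

    Sketch only: argument order and the lack of a 2*pi factor are assumptions.
    """

    def __init__(self, dim_inputs: int, half_dim_output: int, sigma: float):
        super().__init__()
        self.sigma = sigma
        m = half_dim_output                                   # number of (cos, sin) frequency pairs
        # Gaussian frequencies, registered as a buffer: moved by .to(device) but not trained
        self.register_buffer("B", torch.randn(m, dim_inputs) * sigma)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        proj = x @ self.B.T                                   # (batch, m)
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)  # (batch, 2m)
```

For example, with `dim_inputs=1` and `half_dim_output=32`, a `(batch, 1)` input becomes a `(batch, 64)` feature that feeds the first hidden `Linear` layer.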
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.
File | Description
---|---
pinn/utils.py | Added CLI argument `--fourier_embedding_sigma`
pinn/pinn_1d.py | Imported `warnings`, defined `FourierEmbedding`, updated `Level` & `MultiLevelNN` to use the new layer, and passed `sigma` through `main`
Comments suppressed due to low confidence (2)
pinn/pinn_1d.py:136
- Docstring refers to `half_dim_outputs` but the parameter is named `half_dim_output`. Update the doc to match the argument name.
  `half_dim_outputs: The output dimension is 2*half_dim_output.`
pinn/pinn_1d.py:130
- The new `FourierEmbedding` layer and its integration paths lack unit tests. Consider adding tests for forward-pass shapes and behavior with various sigma values (a test sketch follows below).
  `class FourierEmbedding(nn.Module):`
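A possible shape test along those lines (a sketch: the `pinn.pinn_1d` import path comes from this PR, but the `FourierEmbedding(dim_inputs, half_dim_output, sigma)` argument order is assumed and may need adjusting):

```python
import pytest
import torch

from pinn.pinn_1d import FourierEmbedding  # path from this PR; the signature used below is assumed


@pytest.mark.parametrize("sigma", [1.0, 5.0, 10.0])
def test_fourier_embedding_output_shape(sigma):
    dim_inputs, half_dim_output, batch = 1, 16, 8
    layer = FourierEmbedding(dim_inputs, half_dim_output, sigma)
    x = torch.rand(batch, dim_inputs)
    out = layer(x)
    # output concatenates cos and sin features: 2 * half_dim_output columns
    assert out.shape == (batch, 2 * half_dim_output)
    # cos/sin outputs stay bounded regardless of sigma
    assert out.abs().max() <= 1.0 + 1e-6
```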
help="Configuration for learning rate scheduler. " | ||
"Follow https://docs.pytorch.org/docs/stable/optim.html for full list of schedulers. " | ||
"The setting is corresponding to `--scheduler` setting.") | ||
parser.add_argument("--fourier_embedding_sigma", type=float, default=-1, help="Sigma for Fourier embedding. Recommended [1,10] ") |
Using a default of -1 means the code will treat a negative sigma as 'provided' and apply Fourier embedding with `sigma = -1`. Consider defaulting to `None` and checking `if fourier_embedding_sigma is not None and fourier_embedding_sigma > 0` (see the sketch after the suggestion below).
parser.add_argument("--fourier_embedding_sigma", type=float, default=-1, help="Sigma for Fourier embedding. Recommended [1,10] ") | |
parser.add_argument("--fourier_embedding_sigma", type=float, default=None, help="Sigma for Fourier embedding. Recommended [1,10] ") |
super().__init__()
self.sigma = sigma
m = half_dim_output  # number of frequency pairs (cos, sin)
B = torch.rand(m, dim_inputs) * sigma
The reference specifies sampling `B` from a Gaussian distribution, but `torch.rand` uses a uniform distribution. Replace with `torch.randn(m, dim_inputs) * sigma` for correct Gaussian sampling.
- B = torch.rand(m, dim_inputs) * sigma
+ B = torch.randn(m, dim_inputs) * sigma
where $B \in \mathbb{R}^{m\times d}$ is sampled from a Gaussian distribution $N(0,\sigma^2)$.
Ref: https://arxiv.org/pdf/2308.08468