
Conversation


@stevengogogo stevengogogo commented Jun 20, 2025

$$ \gamma(x) = [\cos (\mathbf{B}x) ~~ \sin(\mathbf{B}x)]^T $$

where $\mathbf{B} \in \mathbb{R}^{m\times d}$ is sampled from the Gaussian distribution $\mathcal{N}(0,\sigma^2)$.

Ref: https://arxiv.org/pdf/2308.08468
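A minimal sketch of this mapping in PyTorch, under the naming used in the diff below (`FourierEmbedding`, `dim_inputs`, `half_dim_output` come from the PR; everything else is illustrative, not the actual implementation):

```python
import torch
import torch.nn as nn

class FourierEmbedding(nn.Module):
    """x -> [cos(Bx), sin(Bx)] with B ~ N(0, sigma^2); B is fixed after init."""

    def __init__(self, dim_inputs: int, half_dim_output: int, sigma: float):
        super().__init__()
        m = half_dim_output  # number of (cos, sin) frequency pairs
        # Gaussian frequency matrix, not trained; a buffer so it follows .to(device)
        self.register_buffer("B", torch.randn(m, dim_inputs) * sigma)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        proj = x @ self.B.T  # (batch, dim_inputs) -> (batch, m)
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)  # (batch, 2m)
```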

@stevengogogo stevengogogo added the enhancement New feature or request label Jun 20, 2025
@stevengogogo stevengogogo marked this pull request as ready for review June 20, 2025 22:36
layer_dim = [dim_inputs] + dim_hidden + [dim_outputs]
# multi-layer MLP
layers = [nn.Linear(layer_dim[i], layer_dim[i + 1]) for i in range(len(layer_dim) - 1)]
self.linear = nn.ModuleList(layers)
Collaborator Author (stevengogogo) commented:

I need to find a way to make this part more comprehensible.

Basically, if the Fourier embedding is enabled, its output dimension is 2m = hidden, where m is determined by the first hidden-layer argument.

Fourier layer: dim_inputs -> 2m
First linear: hidden -> hidden
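For example, with the `layer_dim` construction quoted above, the wiring could look like this when the embedding is enabled (a sketch with made-up sizes; the actual PR code may differ):

```python
import torch.nn as nn

dim_inputs, dim_hidden, dim_outputs = 1, [64, 64], 1   # illustrative sizes
m = dim_hidden[0] // 2                                  # 2m equals the first hidden width
# The Fourier layer handles dim_inputs -> 2m, so the MLP starts from 2m instead of dim_inputs
layer_dim = [2 * m] + dim_hidden + [dim_outputs]
layers = [nn.Linear(layer_dim[i], layer_dim[i + 1]) for i in range(len(layer_dim) - 1)]
# The first linear is then hidden -> hidden (64 -> 64), matching the comment above.
```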

Collaborator Author (stevengogogo) commented:

Maybe there is a better way to define this, e.g.

FourierLayer(input, output, sigma)

but keeping the arguments simple is challenging.
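One hypothetical shape for that API, taking the full output width as the argument and deriving m internally (`FourierLayer` is not part of the diff; this is only a sketch):

```python
import torch
import torch.nn as nn

class FourierLayer(nn.Module):
    """Same mapping as above, parameterised by the full output width instead of m."""

    def __init__(self, dim_inputs: int, dim_outputs: int, sigma: float):
        super().__init__()
        assert dim_outputs % 2 == 0, "output width must be even: (cos, sin) pairs"
        self.register_buffer("B", torch.randn(dim_outputs // 2, dim_inputs) * sigma)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        proj = x @ self.B.T
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)

# usage: the output width matches the first hidden layer directly
layer = FourierLayer(dim_inputs=1, dim_outputs=64, sigma=5.0)
```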

super().__init__()
self.sigma = sigma
m = half_dim_output # number of frequencies pairs (cos, sin)
B = torch.rand(m, dim_inputs) * sigma
Collaborator Author (stevengogogo) commented:

This should be sampled from a Gaussian (torch.randn), not torch.rand.

@stevengogogo stevengogogo requested a review from Copilot June 27, 2025 17:13
@Copilot Copilot AI left a comment

Pull Request Overview

This PR introduces a Fourier embedding layer to help the model capture high-frequency components by mapping inputs through sinusoidal features.

  • Added a --fourier_embedding_sigma argument to enable/disable Fourier embeddings and set their scale.
  • Implemented FourierEmbedding module sampling fixed frequency transforms.
  • Integrated FourierEmbedding into Level and MultiLevelNN when fourier_embedding_sigma is provided.

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| pinn/utils.py | Added CLI argument `--fourier_embedding_sigma` |
| pinn/pinn_1d.py | Imported `warnings`, defined `FourierEmbedding`, updated `Level` & `MultiLevelNN` to use the new layer, and passed `sigma` through `main` |
Comments suppressed due to low confidence (2)

pinn/pinn_1d.py:136

  • Docstring refers to half_dim_outputs but the parameter is named half_dim_output. Update the doc to match the argument name.
            half_dim_outputs: The output dimension is 2*half_dim_output.

pinn/pinn_1d.py:130

  • The new FourierEmbedding layer and its integration paths lack unit tests. Consider adding tests for forward pass shapes and behavior with various sigma values.
class FourierEmbedding(nn.Module):
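A shape/behaviour test along those lines might look as follows (a sketch; the import path and constructor signature are assumptions based on this PR):

```python
import pytest
import torch
from pinn.pinn_1d import FourierEmbedding  # assumed location of the new module

@pytest.mark.parametrize("sigma", [1.0, 5.0, 10.0])
def test_fourier_embedding_forward_shape(sigma):
    emb = FourierEmbedding(dim_inputs=1, half_dim_output=32, sigma=sigma)
    out = emb(torch.rand(8, 1))
    assert out.shape == (8, 64)      # 2 * half_dim_output
    assert out.abs().max() <= 1.0    # cos/sin outputs are bounded
```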

help="Configuration for learning rate scheduler. "
"Follow https://docs.pytorch.org/docs/stable/optim.html for full list of schedulers. "
"The setting is corresponding to `--scheduler` setting.")
parser.add_argument("--fourier_embedding_sigma", type=float, default=-1, help="Sigma for Fourier embedding. Recommended [1,10] ")
Copilot AI Jun 27, 2025

Using a default of -1 means the code will treat a negative sigma as 'provided' and apply Fourier embedding with sigma = -1. Consider defaulting to None and checking if fourier_embedding_sigma is not None and fourier_embedding_sigma > 0.

Suggested change
parser.add_argument("--fourier_embedding_sigma", type=float, default=-1, help="Sigma for Fourier embedding. Recommended [1,10] ")
parser.add_argument("--fourier_embedding_sigma", type=float, default=None, help="Sigma for Fourier embedding. Recommended [1,10] ")

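The check Copilot describes can be demonstrated in isolation (a small sketch; `fourier_enabled` is a hypothetical helper, not part of the diff):

```python
def fourier_enabled(sigma):
    """True only for an explicitly provided, positive sigma."""
    return sigma is not None and sigma > 0

assert fourier_enabled(5.0)
assert not fourier_enabled(None)   # flag omitted
assert not fourier_enabled(-1.0)   # the old sentinel default no longer slips through
```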

super().__init__()
self.sigma = sigma
m = half_dim_output # number of frequencies pairs (cos, sin)
B = torch.rand(m, dim_inputs) * sigma
Copilot AI Jun 27, 2025

The reference specifies sampling B from a Gaussian distribution, but torch.rand uses a uniform distribution. Replace with torch.randn(m, dim_inputs) * sigma for correct Gaussian sampling.

Suggested change
B = torch.rand(m, dim_inputs) * sigma
B = torch.randn(m, dim_inputs) * sigma

