Issue Description:
There is currently a mismatch between the intended and actual usage of input and hidden channels in the `ResGatedGraphConvNetBase` class implementation.
🧩 Problem Description
- Why are input channels needed if we have `n_atom_properties`?
  In our current setup, `n_atom_properties` effectively acts as the number of input channels for the first convolution layer. This raises the question of whether we need an explicit `in_length` parameter at all, since it is not used directly. The only place it appears meaningful is in the code, where it is incorrectly applied to multiple convolution layers.
- Incorrect channel assignment in the previous version:
  In the current implementation, all convolution layers used `in_length` as the input channel size, and only the final layer transitioned to `hidden_length`. This does not follow conventional practice. As in PyG's `BasicGNN`, the correct pattern should be:
  - First layer: `input_channels` (`n_atom_properties` in our case) → `hidden_channels`
  - Hidden layers: `hidden_channels` → `hidden_channels`
  - (Optionally) Final layer: `hidden_channels` → `output_channels` or similar
- Off-by-one error in the number of convolution layers:
  If the user specifies `n_conv_layers=3`, the previous code creates 4 layers due to an extra one outside the loop (`n + 1` instead of `n`). This is misleading and could result in unintended architectural depth.
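The two problems above can be illustrated side by side with a small, dependency-free sketch. The helper functions below are hypothetical (they are not part of the codebase) and only compute the `(in_channels, out_channels)` pairs each construction strategy would produce:

```python
def buggy_channels(in_length, hidden_length, n_conv_layers):
    # Previous (incorrect) behaviour as described in this issue:
    # every in-loop layer used in_length as its input size, and one
    # extra layer was appended outside the loop (n + 1 layers total).
    dims = [(in_length, in_length) for _ in range(n_conv_layers)]
    dims.append((in_length, hidden_length))  # extra layer outside the loop
    return dims


def fixed_channels(n_atom_properties, hidden_length, n_conv_layers):
    # Conventional BasicGNN-style pattern: the first layer maps
    # n_atom_properties -> hidden_length, all remaining layers map
    # hidden_length -> hidden_length, and exactly n layers are built.
    dims = [(n_atom_properties, hidden_length)]
    dims += [(hidden_length, hidden_length) for _ in range(n_conv_layers - 1)]
    return dims


print(buggy_channels(12, 64, 3))  # 4 pairs instead of the requested 3
print(fixed_channels(12, 64, 3))  # exactly 3 pairs
```

With `n_conv_layers=3`, the buggy version yields four layer specs while the fixed version yields exactly three, with only the first consuming the raw atom features.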
✅ Proposed Fix
- Use `n_atom_properties` as the number of input channels.
- Remove `in_length` if it's redundant.
- Ensure the first convolution layer maps from `n_atom_properties` → `hidden_length`, and all subsequent layers use `hidden_length` → `hidden_length`.
- Fix the layer creation logic to respect `n_conv_layers` accurately (i.e., create exactly `n` layers).
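A minimal sketch of what the fixed constructor logic could look like. `Conv` here is a stand-in for the real convolution class (e.g. PyG's `ResGatedGraphConv`), and the constructor signature is an assumption for illustration; only the channel bookkeeping is the point:

```python
class Conv:
    # Stand-in for the real graph convolution layer; records only
    # the channel sizes, which is what this issue is about.
    def __init__(self, in_channels, out_channels):
        self.in_channels = in_channels
        self.out_channels = out_channels


class ResGatedGraphConvNetBase:
    # Hypothetical fixed constructor: no in_length parameter,
    # n_atom_properties serves as the input channel count.
    def __init__(self, n_atom_properties, hidden_length, n_conv_layers):
        # First layer: n_atom_properties -> hidden_length.
        self.convs = [Conv(n_atom_properties, hidden_length)]
        # Remaining n_conv_layers - 1 layers: hidden -> hidden.
        for _ in range(n_conv_layers - 1):
            self.convs.append(Conv(hidden_length, hidden_length))


net = ResGatedGraphConvNetBase(n_atom_properties=12, hidden_length=64,
                               n_conv_layers=3)
print(len(net.convs))  # 3, not 4
```

Building the first layer before the loop and then iterating `n_conv_layers - 1` times removes both the off-by-one and the wrong input sizes in one change.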