Adding MolGAN's basic implementation #13
Conversation
Some questions about this implementation:
However, during training this would give discrete structures, which are hard to backpropagate through. I believe the original implementation uses a softmax function, with the option for the user to switch between softmax, soft_gumbel, and hard_gumbel. During evaluation one can then use torch.max to obtain a discrete structure; a rough sketch is below.
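For illustration, a minimal PyTorch sketch of that switch (the `postprocess` helper and its argument names are hypothetical, not from this PR):

```python
import torch
import torch.nn.functional as F

def postprocess(logits: torch.Tensor, method: str = "soft_gumbel",
                tau: float = 1.0) -> torch.Tensor:
    """Relax discrete node/edge logits so gradients can flow during training.

    method: "softmax"     - plain softmax relaxation
            "soft_gumbel" - Gumbel-softmax sample (continuous, differentiable)
            "hard_gumbel" - straight-through Gumbel-softmax (one-hot forward
                            pass, soft gradients on the backward pass)
    """
    if method == "softmax":
        return F.softmax(logits / tau, dim=-1)
    if method == "soft_gumbel":
        return F.gumbel_softmax(logits, tau=tau, hard=False, dim=-1)
    if method == "hard_gumbel":
        return F.gumbel_softmax(logits, tau=tau, hard=True, dim=-1)
    raise ValueError(f"unknown method: {method}")

# At evaluation time, take the argmax to obtain a discrete structure.
logits = torch.randn(8, 5)                 # e.g. 8 nodes, 5 atom types
probs = postprocess(logits, "soft_gumbel")
discrete = probs.argmax(dim=-1)            # or torch.max(probs, dim=-1).indices
```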
Let me know if the above makes sense.
Description
Solves issue: #8
This PR adds support for MolGAN, a GAN-based model for molecular graph generation. It includes the `MolGAN` class, which is compatible with the `BaseMolecularGenerator` class and has a configurable architecture. It also uses `RewardOracle`, which supports both a neural-network-based reward function (via the `RewardNeuralNetwork` class) and the `NonNeuralNetwork` reward class. Please note, however, that the neural-network RL implementation still requires more work. The design also leaves room for future exploration, such as config classes for defining the generators and discriminators in the GAN class. It additionally includes the graph implementation, which handles conversion between SMILES data and graph representations. Note that `use_reward` must be activated for reward-based training to work; otherwise the model trains as a regular GAN. Future PRs can add Hugging Face support and better reward functions. A rough usage sketch is below.