I then decided to push it even further by adding [Convolutional Neural Networks (CNN)](https://en.wikipedia.org/wiki/Convolutional_neural_network), [Recurrent Neural Networks (RNN)](https://en.wikipedia.org/wiki/Recurrent_neural_network), [Autoencoders](https://en.wikipedia.org/wiki/Autoencoder), [Variational Autoencoders (VAE)](https://en.wikipedia.org/wiki/Variational_autoencoder), [GANs](https://en.wikipedia.org/wiki/Generative_adversarial_network) and [Transformers](https://en.wikipedia.org/wiki/Transformer_(machine_learning_model)).
Regarding the Transformers, I basically reimplemented the [Attention is All You Need](https://arxiv.org/abs/1706.03762) paper, but I had some issues with the gradients and the normalization of the attention weights, so I decided to leave it as it is for now. It theoretically works but needs a huge amount of training data, which isn't feasible on a CPU. You can, however, see what each layer produces and how the attention weights are calculated [here](examples/models-usages/generation/transformer-text-generation/transformer-debug.ipynb).
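
For context, the attention weights mentioned above are just a softmax over the scaled dot products of queries and keys, as defined in the paper. Here is a minimal NumPy sketch of that computation (independent of this library's internals):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from "Attention is All You Need".

    Q, K: arrays of shape (seq_len, d_k); V: array of shape (seq_len, d_v).
    Returns the attended values and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # raw compatibility scores
    scores -= scores.max(axis=-1, keepdims=True)   # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax: each row sums to 1
    return weights @ V, weights
```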
This project will be maintained as long as I have ideas to improve it, and as long as I have time to work on it.
## Installation

```bash
pip install neuralnetlib
```
## Basic usage
See [this file](examples/models-usages/mlp-classification-regression/mnist_multiclass.ipynb) for a simple example of how to use the library.<br>
For a more advanced example, see [this file](examples/models-usages/cnn-classification/cnn_classification_mnist.ipynb) for using a CNN.<br>
You can also check [this file](examples/models-usages/mlp-classification-regression/sentiment_analysis.ipynb) for text classification using an RNN.<br>
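
If you want a quick feel for the workflow before opening the notebooks, here is a rough sketch of a Keras-style training loop. The import paths, class names, and method signatures below are assumptions for illustration, not the library's verified API; the notebooks above show the real calls.

```python
# Hypothetical sketch only: import paths, class names, and signatures are
# assumed for illustration and may not match neuralnetlib's actual API.
import numpy as np
from neuralnetlib.models import Sequential  # assumed import path
from neuralnetlib.layers import Dense       # assumed import path

X = np.random.rand(100, 784)                    # dummy flattened 28x28 images
y = np.eye(10)[np.random.randint(0, 10, 100)]   # dummy one-hot labels

model = Sequential()
model.add(Dense(64, activation='relu'))         # hidden layer
model.add(Dense(10, activation='softmax'))      # 10-class output
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(X, y, epochs=10, batch_size=32)
preds = model.predict(X)                        # class probabilities
```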
## Advanced usage
See [this file](examples/models-usages/generation/autoencoder_vae_example.ipynb) for an example of how to use a VAE to generate new images.<br>
And [this file](examples/models-usages/rnn-text-generation/dinosaur_names_generator.ipynb) for an example of how to generate new dinosaur names.<br>
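
For intuition on the VAE example: once the model is trained, generating a new image just means sampling a latent vector from the standard normal prior and running it through the decoder. A small NumPy sketch, where `decoder` is a hypothetical stand-in for a trained decoder network:

```python
import numpy as np

def decoder(z):
    # Hypothetical stand-in for a trained decoder network; a real one
    # would have learned weights mapping latents to pixel intensities.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((784, z.shape[0]))
    return 1.0 / (1.0 + np.exp(-(W @ z)))    # sigmoid -> pixels in (0, 1)

latent_dim = 2
z = np.random.standard_normal(latent_dim)    # sample from the N(0, I) prior
image = decoder(z).reshape(28, 28)           # decode into a new 28x28 image
```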
More examples in [this folder](examples).
You are free to tweak the hyperparameters and the network architecture to see how it affects the results.