
Commit 203ced8

docs(readme): update hyperlinks
1 parent ad06a2d commit 203ced8

1 file changed (+7, −7 lines)


README.md

Lines changed: 7 additions & 7 deletions
@@ -10,7 +10,7 @@ The big part of this project, meaning the [Multilayer Perceptron (MLP)](https://
 
 I then decided to push it even further by adding [Convolutional Neural Networks (CNN)](https://en.wikipedia.org/wiki/Convolutional_neural_network), [Recurrent Neural Networks (RNN)](https://en.wikipedia.org/wiki/Recurrent_neural_network), [Autoencoders](https://en.wikipedia.org/wiki/Autoencoder), [Variational Autoencoders (VAE)](https://en.wikipedia.org/wiki/Variational_autoencoder), [GANs](https://en.wikipedia.org/wiki/Generative_adversarial_network) and [Transformers](https://en.wikipedia.org/wiki/Transformer_(machine_learning_model)).
 
-Regarding the Transformers, I just basically reimplement the [Attention is All You Need](https://arxiv.org/abs/1706.03762) paper, but I had some issues with the gradients and the normalization of the attention weights. So, I decided to leave it as it is for now. It theorically works but needs a huge amount of data that can't be trained on a CPU. You can however see what each layers produce and how the attention weights are calculated [here](examples/generation/transformer-text-generation/transformer-debug.ipynb).
+Regarding the Transformers, I just basically reimplement the [Attention is All You Need](https://arxiv.org/abs/1706.03762) paper, but I had some issues with the gradients and the normalization of the attention weights. So, I decided to leave it as it is for now. It theorically works but needs a huge amount of data that can't be trained on a CPU. You can however see what each layers produce and how the attention weights are calculated [here](examples/models-usages/generation/transformer-text-generation/transformer-debug.ipynb).
 
 This project will be maintained as long as I have ideas to improve it, and as long as I have time to work on it.
 

@@ -40,20 +40,20 @@ pip install neuralnetlib
 
 ## Basic usage
 
-See [this file](examples/classification-regression/mnist_multiclass.ipynb) for a simple example of how to use the library.<br>
-For a more advanced example, see [this file](examples/cnn-classification/cnn_classification_mnist.ipynb) for using CNN.<br>
-You can also check [this file](examples/classification-regression/sentiment_analysis.ipynb) for text classification using RNN.<br>
+See [this file](examples/models-usages/mlp-classification-regression/mnist_multiclass.ipynb) for a simple example of how to use the library.<br>
+For a more advanced example, see [this file](examples/models-usages/cnn-classification/cnn_classification_mnist.ipynb) for using CNN.<br>
+You can also check [this file](examples/models-usages/mlp-classification-regression/sentiment_analysis.ipynb) for text classification using RNN.<br>
 
 ## Advanced usage
 
-See [this file](examples/generation/autoencoder_vae_example.ipynb) for an example of how to use VAE to generate new images.<br>
-And [this file](examples/rnn-text-generation/dinosaur_names_generator.ipynb) for an example of how to generate new dinosaur names.<br>
+See [this file](examples/models-usages/generation/autoencoder_vae_example.ipynb) for an example of how to use VAE to generate new images.<br>
+And [this file](examples/models-usages/rnn-text-generation/dinosaur_names_generator.ipynb) for an example of how to generate new dinosaur names.<br>
 
 More examples in [this folder](examples).
 
 You are free to tweak the hyperparameters and the network architecture to see how it affects the results.
 
-## 🚀 Quick examples (more [here](examples/))
+## 🚀 Quick examples (more [here](examples/models-usages/))
 
 ### Binary Classification
 