Releases: marcpinet/neuralnetlib

neuralnetlib 4.0.7

06 Dec 21:07
d4065bf

  • fix(LSTM): huge improvements in gradient flow
  • feat(gradient_norm): better batch handling
  • fix(Callbacks): now fully working generically
  • fix(metrics): val_loss
  • fix(Embedding): bias in output
  • fix(display): val_*
  • feat(metric): add pearsonr
  • feat(metric): add kurtosis
  • feat(metric): add skew
  • fix(metric): skewness and kurtosis when variance=0
  • fix(gan): batch logs
  • fix(activation): config loading
  • feat(lstm): improved flexibility and parameterizing
  • docs(notebook): fresh run
  • ci: bump version to 4.0.7
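Several of these commits concern the new statistical metrics (pearsonr, skew, kurtosis) and their variance=0 edge case. The sketch below is not neuralnetlib's implementation — it is a generic NumPy illustration of the guard the fix implies: when the variance (or a norm in the denominator) is zero, return 0.0 instead of dividing by zero.

```python
import numpy as np

def skewness(x):
    """Sample skewness; returns 0.0 for constant input (variance == 0)."""
    x = np.asarray(x, dtype=float)
    var = x.var()
    if var == 0:
        return 0.0
    return float(((x - x.mean()) ** 3).mean() / var ** 1.5)

def kurtosis(x):
    """Excess kurtosis; returns 0.0 for constant input instead of NaN."""
    x = np.asarray(x, dtype=float)
    var = x.var()
    if var == 0:
        return 0.0
    return float(((x - x.mean()) ** 4).mean() / var ** 2 - 3.0)

def pearsonr(x, y):
    """Pearson correlation; returns 0.0 when either input has zero spread."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    denom = np.sqrt((xc ** 2).sum() * (yc ** 2).sum())
    if denom == 0:
        return 0.0
    return float((xc * yc).sum() / denom)
```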

neuralnetlib 4.0.6

06 Dec 08:30
5ac546b

  • refactor: autopep8 format
  • ci: bump version to 4.0.6

neuralnetlib 4.0.5

05 Dec 20:48
b4668c2

  • fix(cosine_sim): division by zero
  • refactor: remove the use of enums
  • docs: huge update and new examples
  • docs(readme): update hyperlinks
  • feat: add SVM
  • fix(Dense): bad variable init
  • fix(Tokenizer): bpe tokenizing
  • fix: jacobian matrix instead of approximation
  • fix(AddNorm): backward pass
  • ci: bump version to 4.0.4
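The cosine-similarity division-by-zero fix can be illustrated generically. This is a hedged NumPy sketch, not neuralnetlib's actual code: a zero-norm input yields a defined 0.0 rather than NaN.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity with a zero-norm guard in the denominator."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0  # similarity with a zero vector is defined as 0, not NaN
    return float(a @ b / denom)
```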

neuralnetlib 4.0.4

03 Dec 18:47
fee19e9

  • feat(optimizers): add AdaBelief
  • feat(optimizers): add RAdam
  • feat(loss): add focal loss
  • fix(optimizers-losses-activations): from_config
  • feat(ensemble): add XGBoost
  • feat(layer): add ActivationFunction init as string in Activation
  • ci: bump version to 4.0.4
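Focal loss (Lin et al.) down-weights well-classified examples with a (1 − p_t)^γ factor so training focuses on hard cases. A generic binary sketch of the formula — not neuralnetlib's Loss API:

```python
import numpy as np

def focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t), averaged."""
    y_true = np.asarray(y_true, dtype=float)
    p = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    p_t = np.where(y_true == 1, p, 1 - p)          # prob. of the true class
    alpha_t = np.where(y_true == 1, alpha, 1 - alpha)
    return float(np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t)))
```

With gamma=0 the modulating factor disappears and the loss reduces to alpha-weighted cross-entropy; larger gamma shrinks the contribution of confident predictions.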

neuralnetlib 4.0.3

03 Dec 17:11
1a28cb7

  • refactor: some improvements
  • feat: add validation_split parameter to fit method
  • fix(lstm): vanishing and exploding gradient issue
  • docs: update readme
  • ci: bump version to 4.0.3
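The new validation_split parameter holds out a fraction of the training data for validation during fit. A minimal sketch of the idea — the function name and signature here are hypothetical, not neuralnetlib's:

```python
import numpy as np

def split_validation(X, y, validation_split=0.2, random_state=None):
    """Shuffle indices, then hold out a fraction of samples for validation."""
    rng = np.random.default_rng(random_state)
    idx = rng.permutation(len(X))
    n_val = int(len(X) * validation_split)
    val_idx, train_idx = idx[:n_val], idx[n_val:]
    return X[train_idx], y[train_idx], X[val_idx], y[val_idx]
```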

neuralnetlib 4.0.2

01 Dec 00:33
ffab2fe

  • feat(ensemble): add AdaBoost
  • feat(ensemble): add GradientBoostingMachine
  • ci: bump version to 4.0.2
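AdaBoost iteratively reweights samples so each new weak learner concentrates on the previous one's mistakes. One reweighting round, sketched generically (not the library's ensemble code):

```python
import numpy as np

def adaboost_update(sample_weights, y_true, y_pred):
    """One AdaBoost.M1 round: compute learner weight, reweight samples."""
    w = np.asarray(sample_weights, dtype=float)
    miss = np.asarray(y_true) != np.asarray(y_pred)
    err = w[miss].sum() / w.sum()                        # weighted error rate
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))    # weak learner weight
    w = w * np.exp(np.where(miss, alpha, -alpha))        # boost misclassified
    return w / w.sum(), alpha
```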

neuralnetlib 4.0.1

01 Dec 00:22
f8356fb

  • docs: add gif into notebook
  • feat(GAN): auto sizing for epoch plot
  • feat: add data imputation
  • feat(ensemble): add IsolationForest
  • feat(ensemble): add RandomForest
  • feat(cluster): add KMeans
  • feat(cluster): add DBSCAN
  • docs: update readme
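KMeans is commonly implemented as Lloyd's algorithm: alternate nearest-centroid assignment with centroid recomputation until assignments stabilize. A minimal NumPy sketch, independent of neuralnetlib's actual cluster API:

```python
import numpy as np

def kmeans(X, k, n_iters=100, random_state=0):
    """Lloyd's algorithm: assign to nearest centroid, recompute means."""
    rng = np.random.default_rng(random_state)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)  # keep an empty cluster's centroid unchanged
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids
```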

neuralnetlib 4.0.0

30 Nov 18:28
d841e98

  • feat: add GAN
  • fix(GAN): weights and backpropagation
  • refactor(BatchNormalization): more stable training
  • fix(GAN): some fixes and general improvements
  • feat(Loss): add WGAN loss
  • feat(WGAN): improve generation
  • ci: bump version to 3.3.9
  • fix(WGAN): discriminator training on both real and fake data
  • feat(GAN): improve image generation
  • fix(Transformer): yet another mode collapse fix
  • feat(layers): better gradient scaling and stability
  • feat(Tokenizer): add BPE tokenize mode
  • docs: update readme
  • ci: bump version to 4.0.0
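The WGAN loss replaces the original GAN's log loss with raw critic scores: the critic maximizes the gap between real and fake scores, while the generator maximizes the fake scores. A numeric sketch of those two objectives — not neuralnetlib's Loss class:

```python
import numpy as np

def wgan_losses(critic_real, critic_fake):
    """WGAN objectives: critic widens the real/fake score gap,
    generator pushes fake scores up (both written as losses to minimize)."""
    critic_loss = float(np.mean(critic_fake) - np.mean(critic_real))
    generator_loss = float(-np.mean(critic_fake))
    return critic_loss, generator_loss
```

In the original WGAN formulation, the critic's weights are also clipped (or a gradient penalty is added) to enforce the Lipschitz constraint the loss assumes.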

neuralnetlib 3.3.9

25 Nov 10:22
f7afdd6

  • feat: add GAN
  • fix(GAN): weights and backpropagation
  • refactor(BatchNormalization): more stable training
  • fix(GAN): some fixes and general improvements
  • feat(Loss): add WGAN loss
  • feat(WGAN): improve generation
  • ci: bump version to 3.3.9

neuralnetlib 3.3.8

23 Nov 16:28
2abbfa5

  • docs: add embedding to debug notebook
  • refactor: huge simplification of sce to cels which led to a higher BLEU score
  • fix(Transformer): random state
  • fix: some fixes and improvements
  • fix: some fixes and improvements
  • fix(MultiHeadAttention): attention weights now have correct ranges [-0.7;1.0]
  • fix: some fixes and improvements
  • fix: some fixes and improvements
  • feat: new examples
  • docs: update readme
  • ci: bump version to 3.3.8