Releases: marcpinet/neuralnetlib
neuralnetlib 4.0.7
- fix(LSTM): huge improvements in gradient flow
- feat(gradient_norm): better batch handling
- fix(Callbacks): now fully working generically
- fix(metrics): val_loss
- fix(Embedding): bias in output
- fix(display): val_*
- feat(metric): add pearsonr
- feat(metric): add kurtosis
- feat(metric): add skew
- fix(metric): skewness and kurtosis when variance=0
- fix(gan): batch logs
- fix(activation): config loading
- feat(lstm): improved flexibility and parameterization
- docs(notebook): fresh run
- ci: bump version to 4.0.7
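The 4.0.7 metric additions (pearsonr, kurtosis, skew) and the variance=0 fix can be illustrated with a minimal NumPy sketch. This is not neuralnetlib's actual implementation; the function names and the choice of returning 0.0 for degenerate input are assumptions for illustration:

```python
import numpy as np

def pearsonr(x, y):
    # Pearson correlation; returns 0.0 if either input is constant
    # rather than dividing by a zero norm.
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    denom = np.sqrt((xm ** 2).sum() * (ym ** 2).sum())
    if denom == 0:
        return 0.0
    return float((xm * ym).sum() / denom)

def skewness(x):
    # Fisher-Pearson skewness; guarded against zero variance.
    x = np.asarray(x, dtype=float)
    std = x.std()
    if std == 0:
        return 0.0
    return float(np.mean(((x - x.mean()) / std) ** 3))

def kurtosis(x):
    # Excess kurtosis (normal distribution -> 0.0), same guard.
    x = np.asarray(x, dtype=float)
    std = x.std()
    if std == 0:
        return 0.0
    return float(np.mean(((x - x.mean()) / std) ** 4) - 3.0)
```

The zero-variance guard is the key point of the `fix(metric)` entry: without it, a constant batch makes both statistics return NaN.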
neuralnetlib 4.0.6
- refactor: autopep8 format
- ci: bump version to 4.0.6
neuralnetlib 4.0.5
- fix(cosine_sim): division by zero
- refactor: remove the use of enums
- docs: huge update and new examples
- docs(readme): update hyperlinks
- feat: add SVM
- fix(Dense): bad variable init
- fix(Tokenizer): bpe tokenizing
- fix: use the exact Jacobian matrix instead of an approximation
- fix(AddNorm): backward pass
- ci: bump version to 4.0.5
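The `fix(cosine_sim): division by zero` entry presumably guards the denominator against zero-norm vectors (e.g. an all-zero embedding). A hedged sketch of that guard, with an illustrative `eps` fallback:

```python
import numpy as np

def cosine_similarity(a, b, eps=1e-12):
    # Cosine similarity with a zero-norm guard: if either vector
    # has zero length, return 0.0 instead of dividing by zero.
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0
    return float(a @ b / max(denom, eps))
```

Returning 0.0 (rather than raising) is one reasonable convention; the library may choose differently.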
neuralnetlib 4.0.4
- feat(optimizers): add AdaBelief
- feat(optimizers): add RAdam
- feat(loss): add focal loss
- fix(optimizers-losses-activations): from_config
- feat(ensemble): add XGBoost
- feat(layer): add ActivationFunction init as string in Activation
- ci: bump version to 4.0.4
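The focal loss added in 4.0.4 down-weights easy examples so training focuses on hard ones. A minimal binary version following the standard formulation (Lin et al.); parameter defaults and the function signature are illustrative, not neuralnetlib's API:

```python
import numpy as np

def focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    # Binary focal loss: cross-entropy scaled by (1 - p_t) ** gamma,
    # where p_t is the predicted probability of the true class.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    p_t = np.where(y_true == 1, y_pred, 1.0 - y_pred)
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

With `gamma=0` and `alpha=0.5` this reduces to half the usual binary cross-entropy, which is a handy sanity check.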
neuralnetlib 4.0.3
- refactor: some improvements
- feat: add validation_split parameter to fit method
- fix(lstm): vanishing and exploding gradient issue
- docs: update readme
- ci: bump version to 4.0.3
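One standard remedy for the exploding-gradient side of the LSTM issue fixed in 4.0.3 is clipping by global norm. This sketch shows the technique in general, not necessarily what the library does internally:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=5.0):
    # Rescale a list of gradient arrays so their joint L2 norm
    # does not exceed max_norm; leaves small gradients untouched.
    total = np.sqrt(sum(float(np.sum(g ** 2)) for g in grads))
    if total <= max_norm or total == 0:
        return grads
    scale = max_norm / total
    return [g * scale for g in grads]
```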
neuralnetlib 4.0.2
- feat(ensemble): add AdaBoost
- feat(ensemble): add GradientBoostingMachine
- ci: bump version to 4.0.2
neuralnetlib 4.0.1
- docs: add gif into notebook
- feat(GAN): auto sizing for epoch plot
- feat: add data imputation
- feat(ensemble): add IsolationForest
- feat(ensemble): add RandomForest
- feat(cluster): add KMeans
- feat(cluster): add DBSCAN
- docs: update readme
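The KMeans added in 4.0.1 is, at its core, Lloyd's algorithm: assign each point to its nearest centroid, then move each centroid to its cluster mean. A self-contained sketch of that loop (the function name and signature are hypothetical, not the library's API):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    # Minimal Lloyd's algorithm over a (n_samples, n_features) array.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Distance of every point to every centroid, shape (n, k).
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```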
neuralnetlib 4.0.0
- feat: add GAN
- fix(GAN): weights and backpropagation
- refactor(BatchNormalization): more stable training
- fix(GAN): some fixes and general improvements
- feat(Loss): add WGAN loss
- feat(WGAN): improve generation
- ci: bump version to 3.3.9
- fix(WGAN): discriminator training on both real and fake data
- feat(GAN): improve image generation
- fix(Transformer): yet another mode collapse fix
- feat(layers): better gradient scaling and stability
- feat(Tokenizer): add BPE tokenize mode
- docs: update readme
- ci: bump version to 4.0.0
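The WGAN loss introduced in 4.0.0 replaces the binary cross-entropy GAN objective with the Wasserstein critic objective. The math is simple enough to sketch; function names here are illustrative, not neuralnetlib's:

```python
import numpy as np

def wgan_critic_loss(real_scores, fake_scores):
    # Critic maximizes E[f(real)] - E[f(fake)]; written as a loss
    # to minimize, so the sign is flipped.
    return float(np.mean(fake_scores) - np.mean(real_scores))

def wgan_generator_loss(fake_scores):
    # Generator tries to raise the critic's score on its samples.
    return float(-np.mean(fake_scores))
```

Note that the critic is trained on both real and fake batches each step, which matches the `fix(WGAN): discriminator training on both real and fake data` entry above.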
neuralnetlib 3.3.9
- feat: add GAN
- fix(GAN): weights and backpropagation
- refactor(BatchNormalization): more stable training
- fix(GAN): some fixes and general improvements
- feat(Loss): add WGAN loss
- feat(WGAN): improve generation
- ci: bump version to 3.3.9
neuralnetlib 3.3.8
- docs: add embedding to debug notebook
- refactor: huge simplification of sce to cels, which led to a higher BLEU score
- fix(Transformer): random state
- fix: some fixes and improvements
- fix: some fixes and improvements
- fix(MultiHeadAttention): attention weights now have the correct range [-0.7;1.0]
- fix: some fixes and improvements
- fix: some fixes and improvements
- feat: new examples
- docs: update readme
- ci: bump version to 3.3.8