Replies: 2 comments
- From my point of view, you are training the model for only a few epochs, so the behaviour is somewhat erratic when you start from a fresh model.
- I'm with @nazarPuriy on this. Perhaps training the model for longer might result in better curves? If the model is capable of learning something, the loss should trend downwards over time. Is there anything else you've tried?
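One reason a short run looks erratic is step-to-step noise in the loss. A minimal sketch (assuming NumPy; the synthetic `noisy_loss` curve is illustrative, not real training data) of smoothing a per-step loss with a moving average so the underlying trend becomes visible over a longer run:

```python
import numpy as np

def moving_average(values, window=20):
    """Trailing moving average to smooth a noisy per-step loss curve."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")

# Fake a noisy but slowly decaying training loss over 200 steps.
rng = np.random.default_rng(0)
steps = np.arange(200)
noisy_loss = 2.0 * np.exp(-steps / 80) + rng.normal(0.0, 0.15, size=steps.size)

smoothed = moving_average(noisy_loss, window=20)
# Over the first ~20 steps the raw curve can go up or down at random;
# the smoothed curve shows whether the loss is actually trending downwards.
```

Over only a handful of epochs the raw curve is dominated by the noise term, which is why a fresh model can look like it isn't learning at all.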
- Guys, when I plot the training and test accuracies with and without data augmentation, these are the graphs I get...

As you can see, the test loss for model 0 is higher than model 1's, but the test accuracy for model 0 is also higher than model 1's. How is that possible?
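This can happen because accuracy only looks at the argmax, while cross-entropy loss also penalises how confident the wrong predictions are: a model that is right more often but very confidently wrong on a few examples can have both higher accuracy and higher loss. A minimal sketch (assuming NumPy and a binary-classification setup; the probability tables are made up purely to illustrate the effect):

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-probability assigned to the true class.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def accuracy(probs, labels):
    # Fraction of examples where the argmax class matches the label.
    return np.mean(probs.argmax(axis=1) == labels)

labels = np.array([0, 0, 0, 0])  # true class is 0 for every example

# Model A: 3/4 correct, but its single mistake is very confident,
# so that one example contributes a huge -log(0.01) term to the loss.
model_a = np.array([
    [0.60, 0.40],
    [0.60, 0.40],
    [0.60, 0.40],
    [0.01, 0.99],  # confidently wrong
])

# Model B: only 2/4 correct, but always close to the decision boundary,
# so every per-example loss stays small.
model_b = np.array([
    [0.55, 0.45],
    [0.55, 0.45],
    [0.45, 0.55],  # mildly wrong
    [0.45, 0.55],  # mildly wrong
])

print(accuracy(model_a, labels), cross_entropy(model_a, labels))  # higher acc, higher loss
print(accuracy(model_b, labels), cross_entropy(model_b, labels))  # lower acc, lower loss
```

Here model A gets 75% accuracy with a mean loss of roughly 1.53, while model B gets 50% accuracy with a mean loss of roughly 0.70, reproducing exactly the pattern in your graphs.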