Hello, I ran into an issue while training this network on my own dataset. After around 20 epochs, the network's prediction for every sample converges to nearly the same constant value, around 5. At that point, the number of training samples multiplied by this constant prediction equals the sum of all the label values — in other words, the network appears to have collapsed to predicting the mean of the labels. I suspected the samples were not being paired with their correct labels, but when I printed the sample names alongside their labels, the pairing was correct. Have you encountered a similar issue before?
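The symptom described above is consistent with the network minimizing MSE by outputting a constant: for squared-error loss, the best constant prediction is exactly the label mean, which also makes N times the prediction equal the label sum. A minimal sketch of this sanity check (the label values here are hypothetical, chosen so the mean is 5 to match the reported behavior):

```python
import numpy as np

# Hypothetical training labels; their mean is 5.0, matching the
# constant prediction (~5) reported in the issue.
labels = np.array([2.0, 4.0, 6.0, 8.0])

# For MSE loss, the optimal constant prediction c minimizes
# sum((y_i - c)^2), which gives c = mean(y).
constant_pred = labels.mean()

# The relationship observed in the issue: N * constant prediction
# equals the sum of the label values.
n = len(labels)
print(constant_pred)                     # the collapsed prediction
print(np.isclose(n * constant_pred, labels.sum()))
```

If your trained model's per-sample predictions all sit near `labels.mean()`, the network has learned only the dataset's average, not per-sample structure — common causes to check are a learning rate that is too high, inputs that are not normalized, or a final layer/loss mismatch.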