In addition to the cross-entropy loss, there are also two regularization terms for the KAN model.

The regularization loss is defined for the class KAN_Convolutional_Layer in kan_convolutional/KANConv.py:

```python
def regularization_loss(self, regularize_activation=1.0, regularize_entropy=1.0):
    return sum(
        layer.regularization_loss(regularize_activation, regularize_entropy)
        for layer in self.layers
    )
```

However, in experiment_28x28.ipynb only the cross-entropy loss appears to be used:

```python
criterion_KKAN_Convolutional_Network = nn.CrossEntropyLoss()
```

Am I missing something?
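For context, my understanding is that each layer's regularization combines an L1 penalty on the spline activations with an entropy penalty on their normalized magnitudes, weighted by regularize_activation and regularize_entropy. A rough sketch of that idea (not the repository's exact code; spline_weight and the epsilon constants are illustrative):

```python
import torch

def layer_regularization_loss(spline_weight: torch.Tensor,
                              regularize_activation: float = 1.0,
                              regularize_entropy: float = 1.0) -> torch.Tensor:
    # L1 term: mean absolute spline coefficient per learned activation
    l1_per_activation = spline_weight.abs().mean(dim=-1)
    activation_term = l1_per_activation.sum()
    # Entropy term: entropy of the normalized L1 magnitudes across activations
    p = l1_per_activation / (activation_term + 1e-12)
    entropy_term = -torch.sum(p * torch.log(p + 1e-12))
    return regularize_activation * activation_term + regularize_entropy * entropy_term
```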
Sorry about that: the regularization is implemented in another branch (the one called regularization loss), but it is now outdated because of recent changes we've made. With some minor tweaks it should still work.
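In the meantime, the tweak essentially amounts to adding the regularization term to the cross-entropy loss in the training loop. A minimal sketch, assuming the model contains KAN_Convolutional_Layer modules importable from kan_convolutional/KANConv.py as mentioned above; the reg_weight value and training_step helper are illustrative, not part of the repository:

```python
import torch.nn as nn
from kan_convolutional.KANConv import KAN_Convolutional_Layer  # path taken from the issue

criterion_KKAN_Convolutional_Network = nn.CrossEntropyLoss()
reg_weight = 1e-5  # hypothetical weighting for the regularization term

def training_step(model, images, labels, optimizer):
    optimizer.zero_grad()
    logits = model(images)
    # Sum the regularization over every KAN convolutional layer in the model;
    # each layer already sums over its own sub-layers, so no double counting.
    reg = sum(
        m.regularization_loss(regularize_activation=1.0, regularize_entropy=1.0)
        for m in model.modules()
        if isinstance(m, KAN_Convolutional_Layer)
    )
    loss = criterion_KKAN_Convolutional_Network(logits, labels) + reg_weight * reg
    loss.backward()
    optimizer.step()
    return loss.item()
```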