
Leaky ReLUs with test case #147


Merged
merged 3 commits into from
Apr 16, 2024

Conversation

DrDub (Contributor) commented Dec 6, 2023

I really would like to see #105 merged, so here it is as a clean PR, plus the requested changes (as far as I understand them).

This is the original patch by @echoline (#105 (comment)) plus a test case.

Please note: while writing the test case I learned that the default steepness is 0.5, and that value is not useful for ReLUs. That will be error-prone for users: if they switch to ReLUs and don't change the steepness, nothing will work. Maybe this PR could also change the default steepness whenever the activation function is changed?

Original patch by @echoline (libfann#105 (comment)) plus a test case.
src/fann_train.c Outdated
@@ -40,6 +40,10 @@ fann_type fann_activation_derived(unsigned int activation_function, fann_type st
case FANN_LINEAR_PIECE:
case FANN_LINEAR_PIECE_SYMMETRIC:
return (fann_type)fann_linear_derive(steepness, value);
case FANN_LINEAR_PIECE_LEAKY:
return (fann_type)((value < 0) ? steepness * 0.01 : steepness);
Member

It would probably be a good idea to make the negative slope tunable. Although it doesn't need to be in this PR, as we would use 0.01 as the default anyway.

Contributor Author

I agree this would be a good idea; I can open an issue for it so other people can contribute the enhancement.

bukka (Member) commented Feb 21, 2024

Also, if you could rebase it, that would be great, as GH Actions are now merged, so it runs the tests...

DrDub added 2 commits April 3, 2024 02:24
- Renamed FANN_LINEAR_PIECE_LEAKY to FANN_LINEAR_PIECE_RECT_LEAKY
- Added test case for FANN_LINEAR_PIECE_RECT_LEAKY
DrDub (Contributor Author) commented Apr 3, 2024

Let me know if you have any other comments or concerns.

Comment on lines +686 to +688
neuron_it->value = (fann_type)((neuron_sum < 0) ? 0
: (neuron_sum > multiplier) ? multiplier
: neuron_sum);
Member

NIT: Formatting should actually be done using the ./format.sh script, which uses clang-format, so this will get overwritten. I will run it after merging.

Member

I should really add a check for that to the pipeline.

@bukka bukka merged commit 518eb6b into libfann:master Apr 16, 2024