Releases: edwinb-ai/LeastSquaresSVM
Performance improvements
In this release, several performance enhancements were carried out:
- Linear algebra operations like the dot product are now called using the BLAS interface.
- Broadcasting operations were implemented using the `Tullio.jl` library in order to fully utilize CPU vectorization instructions.
- Checks for the solution and convergence of the linear systems are now carried out every time.
- Some simple precompilation for the `Tullio.jl`-based tools is now executed for `Float64` types.
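To illustrate the techniques involved (this is a sketch of the idea only, not the package's actual internals):

```julia
using LinearAlgebra, Tullio

x, y = rand(1_000), rand(1_000)

# `dot` dispatches to the BLAS interface for dense Float64 vectors.
d = dot(x, y)

# A Tullio.jl expression for pairwise squared distances, a typical
# building block of an RBF kernel. The repeated index `k` is summed
# automatically, and Tullio emits loops that exploit CPU vectorization.
A, B = rand(100, 5), rand(80, 5)
@tullio D[i, j] := (A[i, k] - B[j, k])^2
```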
Test dependencies
In this minor release, the test dependency infrastructure was refactored to use the new style of defining test-only dependencies endorsed by the Julia package manager. This should make it easier to handle test-only dependencies in future releases.
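Concretely, this style declares test-only dependencies in a dedicated `test/Project.toml` rather than in the `[extras]` and `[targets]` sections of the main `Project.toml`. A minimal sketch (only the standard-library `Test` is shown; the package's real test dependencies would be listed alongside it):

```toml
# test/Project.toml — dependencies available only while running the test suite
[deps]
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
```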
Name change
- The package is now called `LeastSquaresSVM` to reflect the true purpose of the package, with less ambiguity.
Thanks to @gmagannaDevelop for the contribution!
Multiclass classification
In this version, an implementation for multiclass classification problems is introduced.
- The implementation is based on the one-vs-one approach; a usage sketch is shown after this list.
- A new example in the documentation was created to showcase this new feature.
- Tests were implemented accordingly to maintain very high code coverage.
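A minimal sketch of multiclass classification through MLJ.jl (the `LSSVClassifier` model name and the `:rbf` kernel Symbol are assumptions, so check the package documentation for the exact spellings):

```julia
using MLJ, LeastSquaresSVM

X, y = @load_iris  # a three-class toy dataset bundled with MLJ

model = LSSVClassifier(kernel=:rbf)  # hypothetical model/kernel names
mach = machine(model, X, y)

train, test = partition(eachindex(y), 0.7; shuffle=true, rng=42)
fit!(mach; rows=train)

# One-vs-one trains a binary classifier for every pair of classes and
# combines their votes to pick the predicted label.
ŷ = predict(mach; rows=test)
accuracy(ŷ, y[test])
```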
Minor improvements
- Code refactoring to reduce memory usage.
- Code formatting for clarity.
Quality of life changes
- Instead of using `String`s for the kernel of choice, we now use `Symbol`s (see the snippet after this list).
- The MLJ interface model now makes the user choose between three predefined kernel implementations.
- A new utility to create keyword arguments based on the attributes of a given model was introduced. It should be helpful for development.
- Additional refactoring and simplification in the documentation. Some examples now use the easier-to-use pipeline macro from MLJ.
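For instance, the switch from `String`s to `Symbol`s for kernel selection looks roughly like this (a sketch; the `LSSVClassifier` name and the exact Symbol spelling `:rbf` are assumptions, so consult the documentation):

```julia
using LeastSquaresSVM

# Previously the kernel was chosen by a String name, e.g. kernel="rbf".
# Now a Symbol selects one of the three predefined kernels; Symbols are
# interned, so comparing them is cheaper than comparing Strings.
model = LSSVClassifier(kernel=:rbf)  # hypothetical spelling of the Symbol
```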
First release
This is the first tagged release. We outline the features of the package below.
- A least-squares implementation of the classic Support Vector Machine, both for classification and regression problems.
- Binary classification available.
- Tight integration with MLJ.jl (see the sketch after this list).
- Can choose between RBF, linear, and polynomial kernels.
- Good examples and documentation.
- 100% code coverage with tests.
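A minimal end-to-end sketch of the regression side through MLJ.jl (the `LSSVRegressor` model name is an assumption based on the feature list above, and kernel selection in this first release used `String`s; check the documentation for exact names):

```julia
using MLJ, LeastSquaresSVM

X, y = @load_boston  # a toy regression dataset bundled with MLJ

# Hypothetical model name; the kernel is chosen by a String here,
# matching this release (later versions switched to Symbols).
model = LSSVRegressor(kernel="rbf")
mach = machine(model, X, y)

train, test = partition(eachindex(y), 0.8; shuffle=true, rng=11)
fit!(mach; rows=train)

ŷ = predict(mach; rows=test)
rms(ŷ, y[test])  # root-mean-square error on the held-out rows
```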