Riemannian Levenberg-Marquardt initial residual and jacobian as kwargs #268
Conversation
To me this looks good. However Mateusz is the expert here, he wrote the algorithm.
Ah, and thanks for your kind words :) While I do care a lot about Manifolds.jl, Manopt.jl is where all of this started and still the package I care about most. So I am very happy when it is useful!
Ah, and I noticed: of course, new keyword arguments have to be documented in the docstring at the beginning. We order them alphabetically in lines 31-37.
Co-authored-by: Ronny Bergmann <[email protected]>
Nice, I think this is a good change. Regarding sparsity, I would need to check whether you actually properly exploit it. Just using a block array may not be enough, and I haven't tested this implementation with sparse Jacobians. You can take a look here for some comments about sparse variants of LM: https://arxiv.org/pdf/2206.05188.pdf. Do you have an example I could run?
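To make "exploiting sparsity" concrete, here is a toy flat-space sketch of one damped (Levenberg-Marquardt-style) step; the Jacobian `J`, residual `r`, and damping `λ` below are invented for illustration, not taken from this PR. The point is that the normal-equations matrix stays sparse all the way through assembly and factorization:

```julia
using SparseArrays, LinearAlgebra

# Toy damped step in flat space; all values are illustrative.
# Exploiting sparsity means J'J + λI is assembled and factored as a
# sparse matrix rather than densified first.
J = sparse([1, 2, 3, 3], [1, 2, 1, 3], [2.0, 3.0, 1.0, 4.0], 3, 3)
r = [1.0, -2.0, 0.5]      # current residual f(p)
λ = 1e-3                  # damping parameter

A = J' * J + λ * I        # remains a SparseMatrixCSC
δ = A \ (-(J' * r))       # step computed via a sparse factorization
```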
Codecov Report
```
@@            Coverage Diff            @@
##           master     #268    +/-  ##
========================================
  Coverage   99.70%   99.70%
========================================
  Files          73       73
  Lines        6465     6465
========================================
  Hits         6446     6446
  Misses         19       19
```
Thanks for the article. I'm sure I'm not properly exploiting sparsity yet, but at least it did improve performance quite a bit.

EDIT:

```julia
# ]add IncrementalInference#23Q2/enh/manopt
# ]add RoME#master
using RoME, IncrementalInference

fg = generateGraph_Hexagonal(; graphinit=false, landmark=false)
v, result = IIF.solve_RLM_sparse(fg)
```

This generates and solves a small factor graph of a robot driving in a hexagonal pattern, which results in:
Yes, I think the default is the issue in my case. I have a vector of points on SE(n) and get this error without the change.
Thanks for the example. I've taken a look at performance. There seem to be no major issues; it looks like it could be made 2x-3x faster by fixing small things. Do the results look correct in this example, though? Or, what is wrong there? I think correctness should be solved first.
...but could we improve our default so that it at least does not error when initialising LM?
Thanks, I still haven't quite figured out type stability vs. compile time yet. I also still have to figure out how to improve type stability of the CalcFactorManopt struct; I did see it was dynamically dispatching in places.

The hex result is correct. I do not get the correct results in a larger graph on SE(3) yet. I'm still learning Manopt and suspect it might not have converged, as the result does look close.
Oh, I was just referring to the initialisation, not to your solver run. You can easily look at some debug output if that helps (for example, whether your gradient norm is small).
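As a hedged, flat-space illustration of that check (the residual `f`, Jacobian `J`, and point `p` below are invented, not from the PR): for the least-squares cost F(p) = ½‖f(p)‖², the Euclidean gradient is J(p)ᵀ f(p), so a small norm suggests the solver has reached a stationary point:

```julia
using LinearAlgebra

# Invented toy residual and its Jacobian, only to show the check itself.
f(p) = [p[1]^2 - 1, p[2] - p[1]]
J(p) = [2p[1] 0.0; -1.0 1.0]

p = [1.0, 1.0]                  # candidate point returned by a solver
grad_norm = norm(J(p)' * f(p))  # gradient norm of F(p) = ½‖f(p)‖²
```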
@kellertuer, I don't know what default initialization will work here; I think it needs the element type. I just used
That doesn't work, as it's not on the manifold but in Rn, if I'm not misunderstanding RLM completely. What else is needed for this PR? Is it just better default values for
That's why I used M, the manifold one is on, and not Rn; but it was also just a guess, since I am still trying to understand which values we need. So here is why I am confused.
I would have thought 2) might also be a collection of points or tangent vectors, hence my guess. So @mateuszbaran, what could we best do here to get the right number type even for
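One possible direction, purely as a sketch (`scalar_type` is a made-up helper, not a Manifolds.jl or Manopt.jl function): walk into the nested point representation until a scalar is found and allocate a plain vector of that number type, instead of `similar(p, m)`:

```julia
# Made-up helper: recover the scalar eltype from a nested point; a tuple of
# arrays stands in here for an ArrayPartition of (SVector, SMatrix).
scalar_type(x::Number) = typeof(x)
scalar_type(x::Tuple) = scalar_type(first(x))
scalar_type(x::AbstractArray) = scalar_type(first(x))

p = ([0.5, 1.0], [1.0 0.0; 0.0 1.0])  # stand-in for a point on SE(2)
m = 5                                  # number of residual components
initial_residual_values = zeros(scalar_type(p), m)
```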
I would propose merging this PR as-is and opening an issue about improving the default for initial residuals later.
That's of course also fine with me.
Hi, I wanted to test using Levenberg-Marquardt in JuliaRobotics/IncrementalInference.jl#1724 but came across two issues that I fixed in this PR.
```julia
initial_residual_values = similar(p, get_objective(nlso).num_components)
```

This didn't work for me on SE(n) with `p` an `ArrayPartition` of `(SVector, SMatrix)`.

```julia
initial_jacF = similar(p, get_objective(nlso).num_components, manifold_dimension(M))
```

I wanted to use a sparse Jacobian, so I had to supply the initial Jacobian myself.
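To make the workaround concrete, here is a sketch of the two allocations themselves; the sizes `m` and `n` are made up, and the actual solver call is only indicated in a comment, since the exact keyword interface is what this PR introduces:

```julia
using SparseArrays

m, n = 6, 3   # made-up: number of residual components, manifold dimension

# A plain dense residual vector instead of `similar(p, m)`, which fails for
# an ArrayPartition of static arrays, plus a sparse initial Jacobian.
initial_residual_values = zeros(m)
initial_jacF = spzeros(m, n)

# These would then be handed to the solver via the new keyword arguments,
# roughly (signature indicative only, depends on this PR's interface):
# LevenbergMarquardt(M, f, jacobian_f, p;
#     initial_residual_values = initial_residual_values,
#     initial_jacF = initial_jacF)
```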
I'm open to suggestions of other ways to do it, but thought I'd rather open a PR than an issue.
PS. I almost have Riemannian Levenberg-Marquardt working as one of our solvers, but I'm not getting the correct results yet and still need to figure out some performance issues. But thanks for another great addition to the Julia manifolds ecosystem.