solveTree! only returns tree #1381

Merged
merged 12 commits on Sep 10, 2021
1 change: 1 addition & 0 deletions NEWS.md
@@ -24,6 +24,7 @@ The list below highlights major breaking changes, and please note that significa
- Deprecating `approxConvBinary`, use `approxConvBelief` instead.
- Removing obsolete `approxConvCircular`, use `approxConvBelief` instead.
- `getSample` should return a single sample and no longer takes the N(number of samples) parameter.
- `solveTree!` / `solveGraph!` now returns just one value `tree<:AbstractBayesTree`. Previous version returned three values, `tree, smt, hist` (#1379).

# Major changes in v0.24

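To make the NEWS entry above concrete, here is a minimal migration sketch. It is not part of this PR; it only reuses calls (`initfg`, `addVariable!`, `addFactor!`, `solveTree!`) that appear in the test files further down in this diff.

```julia
using IncrementalInference
using Distributions

# build a trivial one-variable graph
fg = initfg()
addVariable!(fg, :x0, ContinuousScalar)
addFactor!(fg, [:x0], Prior(Normal(0.0, 1.0)))

# before this PR, solveTree! returned three values:
#   tree, smt, hist = solveTree!(fg)

# with this PR, only the Bayes tree is returned:
tree = solveTree!(fg)
```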
2 changes: 1 addition & 1 deletion examples/BayesTreeIllustration.jl
@@ -25,7 +25,7 @@ tree = buildTreeReset!(fg, drawpdf=true, show=true)
# solve the factor graph and show solving progress on tree in src/JunctionTree.jl
fg.solverParams.showtree = true
fg.solverParams.drawtree = true
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)


## building a new tree -- as per IIF.prepBatchTree(...)
2 changes: 1 addition & 1 deletion examples/FixedPointIllustrationsSquare.jl
@@ -106,7 +106,7 @@ function runFullBatchIterations(;N=100, iters=50)

FG = deepcopy(fg)
for i in 1:iters
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)
push!(FG, deepcopy(fg))
end
return FG
2 changes: 1 addition & 1 deletion examples/IllustrateAutoInit.jl
@@ -93,7 +93,7 @@ plotKDE(fg, [:x0, :x1, :x2, :x3])

# Find global best likelihood solution (posterior belief)
# After defining the problem, we can find the 'minimum free energy' solution
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# and look at the posterior belief, and notice which consensus modes stand out in the posterior
plotKDE(fg, [:x0, :x1, :x2, :x3])
2 changes: 1 addition & 1 deletion examples/MultiHypo2Door.jl
@@ -74,7 +74,7 @@ drawGraph(fg, show=true)
tree = buildTreeReset!(fg, drawpdf=true, show=true)

## Solve graph
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# tree = buildTreeReset!(fg, drawpdf=true, show=true)

2 changes: 1 addition & 1 deletion examples/MultiHypo3Door.jl
@@ -75,7 +75,7 @@ tree = buildTreeReset!(fg, drawpdf=true, show=true)


## Solve graph
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

## Plotting functions below

2 changes: 1 addition & 1 deletion examples/RobotFourDoor.jl
@@ -62,7 +62,7 @@ plotKDE(fg, :x4)
addFactor!(fg,[:x4], doorPrior)

# solve over all data
tree, smt, hists = solveTree!(fg)
tree = solveTree!(fg)

# list variables and factors in fg
@show ls(fg) # |> sortDFG
2 changes: 1 addition & 1 deletion examples/squarefixedpoint.jl
@@ -38,7 +38,7 @@ doautoinit!(fg, :xy)
initManual!(fg, :x, randn(1,100))

# find solution
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

## plot the result
plotKDE(map(x->getKDE(fg,x), [:x; :xy]))
10 changes: 7 additions & 3 deletions src/IncrementalInference.jl
@@ -14,7 +14,11 @@ using Reexport

using Manifolds

-export ℝ, AbstractManifold, Euclidean, Circle
+export ℝ, AbstractManifold
+# common groups -- preferred defaults at this time.
+export TranslationGroup, CircleGroup
+# common non-groups -- TODO still teething problems to sort out in IIF v0.25-v0.26.
+export Euclidean, Circle

import NLsolve
import NLSolversBase
@@ -119,9 +123,9 @@ export FunctorInferenceType, PackedInferenceType
export AbstractPrior, AbstractRelative, AbstractRelativeRoots, AbstractRelativeMinimize

# not sure if this is necessary
-export convert
+export convert, *

-export *,
+export
CSMHistory,
# getTreeCliqsSolverHistories,

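The comments in the export hunk above distinguish the group manifolds now preferred (`TranslationGroup`, `CircleGroup`) from the plain non-group manifolds (`Euclidean`, `Circle`) that remain exported. Below is a short illustrative Manifolds.jl sketch of that distinction, not code from this PR.

```julia
using Manifolds

# group manifolds -- carry a group operation (composition, identity, inverse)
G = TranslationGroup(1)   # ℝ^1 with addition as the group operation
C = CircleGroup()         # unit complex numbers under multiplication

# plain manifolds -- same point sets, no group structure attached
E = Euclidean(1)
S = Circle()

p, q = [0.0], [1.5]
compose(G, p, q)          # group composition on TranslationGroup is vector addition -> [1.5]
```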
12 changes: 6 additions & 6 deletions src/SolverAPI.jl
@@ -254,7 +254,7 @@ DevNotes
Example
```julia
# pass in old `tree` to enable compute recycling -- see online Documentation for more details
tree, smt, hist = solveTree!(fg [,tree])
tree = solveTree!(fg [,tree])
```

Related
@@ -372,8 +372,7 @@ function solveTree!(dfgl::AbstractDFG,
oldtree.eliminationOrder = tree.eliminationOrder
oldtree.buildTime = tree.buildTime

-hist = !opt.async ? fetchCliqHistoryAll!(smtasks) : hist


if opt.drawtree && opt.async
@warn "due to async=true, only keeping task pointer, not stopping the drawtreerate task! Consider not using .async together with .drawtreerate != 0"
push!(smtasks, treetask)
@@ -383,10 +382,11 @@

# if debugging and not async then also print the CSMHistory
if opt.dbg && !opt.async
-printCSMHistorySequential(hist, joinLogPath(dfgl,"HistoryCSMAll.txt") )
+hists = !opt.async ? fetchCliqHistoryAll!(smtasks) : hist
+printCSMHistorySequential(hists, joinLogPath(dfgl,"HistoryCSMAll.txt") )
end

-return oldtree, smtasks, hist
+return oldtree
end

"""
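Since `solveTree!` no longer returns `smt` and `hist`, callers that relied on the clique state-machine histories need another route. The sketch below is an assumption-laden illustration, not code from this PR: it assumes `solveTree!` accepts `smtasks` and `recordcliqs` keyword arguments (not visible in this hunk) and reuses `fetchCliqHistoryAll!` from the diff above, which may need an `IIF.` prefix if it is not exported.

```julia
using IncrementalInference
using Distributions

fg = initfg()
addVariable!(fg, :x0, ContinuousScalar)
addFactor!(fg, [:x0], Prior(Normal(0.0, 1.0)))

smtasks = Task[]   # container the solver is assumed to fill with CSM tasks
tree = solveTree!(fg; smtasks=smtasks, recordcliqs=ls(fg))

# after a synchronous solve, fetch the recorded per-clique histories
hists = fetchCliqHistoryAll!(smtasks)
```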
2 changes: 1 addition & 1 deletion src/TreeDebugTools.jl
@@ -688,7 +688,7 @@ fg = initfg()

fsy = getTreeAllFrontalSyms(fg, tree) # for later use
# perform inference to find the factor graph marginal posterior estimates
tree, smt, hist = solveTree!(fg, recordcliqs=fsy)
tree = solveTree!(fg, recordcliqs=fsy)

# generate frames in standard location /tmp/caesar/csmCompound/
# requires: sudo apt-get install graphviz
8 changes: 8 additions & 0 deletions src/services/HeatmapSampler.jl
@@ -59,6 +59,10 @@ function fitKDE(support,
kde!(support, kernel_bw, weights)
end


+global allthres = Dict{Int, Any}()


function HeatmapDensityRegular( data::AbstractMatrix{<:Real},
domain::Tuple{<:AbstractVector{<:Real},<:AbstractVector{<:Real}},
level::Real,
@@ -68,8 +72,12 @@ function HeatmapDensityRegular( data::AbstractMatrix{<:Real},
bw_factor::Real=0.7, # kde spread between domain points
N::Int=10000 )
#
+global allthres

# select the support from raw data
support_, weights_, roi = getLevelSetSigma(data, level, sigma, domain...; sigma_scale=sigma_scale)
+allthres[length(allthres)+1] = roi

# constuct a pre-density from which to draw intermediate samples
density_ = fitKDE(support_, weights_, domain...; bw_factor=bw_factor)
pts_preIS, = sample(density_, N)
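The HeatmapSampler.jl hunk above introduces a module-level `allthres` dictionary that every `HeatmapDensityRegular` call appends its `roi` result to, apparently as a debugging aid. Below is a self-contained sketch of that capture pattern with generic, hypothetical names (`compute_roi` and `capture_roi!` are not IncrementalInference functions).

```julia
# module-level cache, mirroring `global allthres = Dict{Int, Any}()` above
global allthres = Dict{Int, Any}()

# placeholder for the level-set / region-of-interest computation
compute_roi(data, level) = count(>(level), data)

function capture_roi!(data, level)
    global allthres
    roi = compute_roi(data, level)
    allthres[length(allthres) + 1] = roi   # append under the next integer key
    return roi
end

capture_roi!(rand(10, 10), 0.5)
capture_roi!(rand(10, 10), 0.9)
@show allthres   # accumulated results from every call
```

A module-level cache like this grows on every call and is not thread-safe, which suggests it is intended as a temporary debugging hook rather than a permanent API.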
2 changes: 1 addition & 1 deletion test/TestCSMMultihypo.jl
@@ -72,7 +72,7 @@ getSolverParams(fg).limititers = 30 # previous runaway CSM issue due to excessiv
# getSolverParams(fg).drawtree = false
# getSolverParams(fg).showtree = false

tree, smt, hist = solveTree!(fg, recordcliqs=ls(fg))
tree = solveTree!(fg, recordcliqs=ls(fg))


# drawGraph(fg)
2 changes: 1 addition & 1 deletion test/fourdoortest.jl
@@ -51,7 +51,7 @@ addFactor!(fg,[:x3;:x4], LinearRelative( Normal(200.0,4.0)))
addFactor!(fg,[:x4], doorPrior)

# solve over all data
tree, smt, hists = solveTree!(fg)
tree = solveTree!(fg)

##

4 changes: 2 additions & 2 deletions test/priorusetest.jl
@@ -35,7 +35,7 @@ addFactor!(fg, [:x0; :x1], LinearRelative(Normal(0.0, 0.01)))
addFactor!(fg, [:x1; :x2], LinearRelative(Normal(0.0, 0.01)))

#solve
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)
x0_m = getKDEMean(getKDE(getVariable(fg, :x0)))[1]
x1_m = getKDEMean(getKDE(getVariable(fg, :x1)))[1]
x2_m = getKDEMean(getKDE(getVariable(fg, :x2)))[1]
@@ -88,7 +88,7 @@ addFactor!(fg, [:x2; :l0], LinearRelative(Normal(0, 0.01)))
addFactor!(fg, [:x2; :l1], LinearRelative(Normal(0, 0.01)))

#solve
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

x0_m = getKDEMean(getKDE(getVariable(fg, :x0)))[1]
x1_m = getKDEMean(getKDE(getVariable(fg, :x1)))[1]
2 changes: 1 addition & 1 deletion test/testBasicCSM.jl
@@ -55,7 +55,7 @@ getSolverParams(dfg).limititers = 50
## getSolverParams(dfg).async = true


tree, smtasks, hist = solveTree!(dfg) #, recordcliqs=ls(dfg))
tree = solveTree!(dfg) #, recordcliqs=ls(dfg))

pts_ = getBelief(dfg, :c) |> getPoints
TensorCast.@cast pts[i,j] := pts_[j][i]
24 changes: 12 additions & 12 deletions test/testBasicGraphs.jl
@@ -28,12 +28,12 @@ addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))
@test !isSolved(getVariable(fg, :x0))

# run solver once
tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

@test getSolvedCount(fg, :x0) == 1
@test isSolved(fg, :x0)

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

@test getSolvedCount(fg, :x0) == 2
@test isSolved(fg, :x0)
@@ -61,7 +61,7 @@ fg = initfg()
addVariable!(fg, :x0, ContinuousScalar)
addFactor!(fg, [:x0;], Prior(Normal(1000.0,1.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance
@test abs((getBelief(fg, :x0) |> getKDEMean)[1]-1000) < 0.5
@@ -80,7 +80,7 @@ addVariable!(fg, :x0, ContinuousScalar)
addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))
addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance
@test (getBelief(fg, :x0) |> getKDEMean .|> abs)[1] < 0.4
@@ -101,7 +101,7 @@ addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))
addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))
addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance
@test (getBelief(fg, :x0) |> getKDEMean .|> abs)[1] < 0.4
@@ -122,7 +122,7 @@ addVariable!(fg, :x0, ContinuousScalar)
addFactor!(fg, [:x0;], Prior(Normal(-1.0,1.0)))
addFactor!(fg, [:x0;], Prior(Normal(+1.0,1.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance -- should be zero
@test (getBelief(fg, :x0) |> getKDEMean .|> abs)[1] < 0.8
@@ -142,7 +142,7 @@ addVariable!(fg, :x0, ContinuousScalar)
addFactor!(fg, [:x0;], Prior(Normal(-1.0-1000,1.0)))
addFactor!(fg, [:x0;], Prior(Normal(+1.0-1000,1.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance -- should be zero
@test abs((getBelief(fg, :x0) |> getKDEMean)[1] + 1000) < 0.6
@@ -165,7 +165,7 @@ addFactor!(fg, [:x0;], Prior(Normal(0.0,1.0)))
addFactor!(fg, [:x1;], Prior(Normal(0.0,1.0)))
addFactor!(fg, [:x0;:x1;], LinearRelative(Normal(0.0,10.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance -- should be zero
@test (getBelief(fg, :x0) |> getKDEMean .|> abs)[1] < 0.6
@@ -191,7 +191,7 @@ addFactor!(fg, [:x0;], Prior(Normal(-1.0,1.0)))
addFactor!(fg, [:x1;], Prior(Normal(+1.0,1.0)))
addFactor!(fg, [:x0;:x1;], LinearRelative(Normal(0.0,10.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# check mean and covariance -- should be near each prior
@test abs((getBelief(fg, :x0) |> getKDEMean)[1]+1) < 0.75
@@ -220,7 +220,7 @@ addFactor!(fg, [:x2;], Prior(Normal(+1.0,1.0)))
addFactor!(fg, [:x0;:x1;], LinearRelative(Normal(0.0,1.0)))
addFactor!(fg, [:x1;:x2;], LinearRelative(Normal(0.0,1.0)))

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)



@@ -264,7 +264,7 @@ addFactor!(fg, [:x3;:x4;], LinearRelative(Normal(0.0,1.0)))
# #1196
drawGraph(fg, filepath="testgraphplot/myfg.dot", show=false)

tree, smt, hist = solveTree!(fg, storeOld=true)
tree = solveTree!(fg, storeOld=true)

# using KernelDensityEstimatePlotting
# plotKDE((x->getBelief(fg,x)).([:x0;:x1;:x2;:x3;:x4]))
@@ -331,7 +331,7 @@ pts_ = getPoints(getBelief(fg, :x0))
TensorCast.@cast pts[i,j] := pts_[j][i]
X0 = pts |> deepcopy

tree, smt, hist = solveTree!(fg)
tree = solveTree!(fg)

# values after solve
pts_ = getPoints(getBelief(fg, :x0))
6 changes: 3 additions & 3 deletions test/testBasicParametric.jl
@@ -84,7 +84,7 @@ foreach(x->getSolverData(getVariable(fg,x.first),:parametric).val[1] .= x.second
# getSolverParams(fg).async = true
getSolverParams(fg).graphinit = false

tree2, smt, hist = IIF.solveTree!(fg; algorithm = :parametric) #, recordcliqs=ls(fg))
tree2 = IIF.solveTree!(fg; algorithm = :parametric) #, recordcliqs=ls(fg))


for i in 0:10
@@ -150,7 +150,7 @@ foreach(x->getSolverData(getVariable(fg,x.first),:parametric).val[1] .= x.second
# global smt
# global hist
#force message passing with manual variable order
tree2, smt, hist = solveTree!(fg; algorithm=:parametric, eliminationOrder=[:x0, :x2, :x1])
tree2 = solveTree!(fg; algorithm=:parametric, eliminationOrder=[:x0, :x2, :x1])
# end
foreach(v->println(v.label, ": ", DFG.getSolverData(v, :parametric).val), getVariables(fg))

@@ -189,7 +189,7 @@ foreach(x->getSolverData(getVariable(fg,x.first),:parametric).val[1] .= x.second
# fg.solverParams.drawtree = true
# fg.solverParams.dbg = false
getSolverParams(fg).graphinit = false
tree2, smt, hist = IIF.solveTree!(fg; algorithm=:parametric)
tree2 = IIF.solveTree!(fg; algorithm=:parametric)

# print results
if false