DFGVariable softtype field has abstract type #149

Closed
Affie opened this issue Oct 10, 2019 · 17 comments

Comments

@Affie
Member

Affie commented Oct 10, 2019

Should DFG not export an abstract type, with IIF's InferenceType inheriting from it? I don't like that the field does not have a concrete type; see https://docs.julialang.org/en/v1/manual/performance-tips/#Avoid-fields-with-abstract-type-1
e.g. AbstractSoftType in DFG, then in IIF InferenceType <: AbstractSoftType.
Even VariableNodeData{T<:Any} would be better, I think.

Originally posted by @Affie in https://github.com/_render_node/MDI0OlB1bGxSZXF1ZXN0UmV2aWV3Q29tbWVudDMzMzQxNDkyMQ==/comments/review_comment
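
For reference, a minimal sketch of the pattern the linked performance tip warns about; the type names here (AbstractSoftType, ExampleSoftType, SlowVariable, FastVariable) are hypothetical and only illustrate the idea:

abstract type AbstractSoftType end              # proposed abstract supertype
struct ExampleSoftType <: AbstractSoftType end  # hypothetical concrete softtype

# Field typed with the abstract supertype: the compiler cannot infer a
# concrete layout, so every access is dynamically dispatched.
mutable struct SlowVariable
  softtype::AbstractSoftType
end

# Parametric field: each instantiation carries a concrete, inferable type.
mutable struct FastVariable{T<:AbstractSoftType}
  softtype::T
end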

@Affie
Member Author

Affie commented Oct 10, 2019

It might be nice if DFGVariable{T<:AbstractSoftType} were parametric as well. Then the type information is inherently saved in the variable.
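
A rough sketch of what that could look like; illustrative only, the field list and the softtypeof helper are not the actual DFGVariable API:

mutable struct DFGVariable{T<:AbstractSoftType}
  label::Symbol
  softtype::T
  # ... remaining fields ...
end

# The softtype is then recoverable from the type itself, e.g. for dispatch:
softtypeof(::DFGVariable{T}) where {T} = T  # hypothetical helper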

@dehann
Member

dehann commented Oct 10, 2019

Hi, yes, the lack of a concrete type is no good for performance. DFGVariable{T<: ... } is the right way to go. Not sure if we need to introduce so many layers of abstraction; just inheriting from InferenceVariable might be good enough. Thanks for bringing this up!

@dehann dehann added this to the CloudGraph Syncing (v0.4.x) milestone Oct 10, 2019
@Affie
Member Author

Affie commented Oct 10, 2019

This is why it is blank, I think: InferenceVariable is in IIF. That's why the extra layer.
T<:Any would also be better, but then it's hard to control what users try to put in there.

@dehann
Member

dehann commented Oct 10, 2019

InferenceVariable will likely move to DFG if it hasn't already happened.

@dehann
Member

dehann commented Oct 10, 2019

this issue will likely undo some of #148, but at this point I want to stop talking and actually code some stuff.

@dehann
Member

dehann commented Oct 10, 2019

abstract type InferenceVariable end

not sure why it is in DFGFactor.jl, but not a concern right now.

@Affie
Member Author

Affie commented Oct 10, 2019

Ok, then it's not a problem. I didn't think to look in DFGFactor.jl at all.

@dehann
Member

dehann commented Oct 10, 2019

yeah, weird location, but probably some good reason somewhere.

@Affie
Member Author

Affie commented Oct 10, 2019

I'll quickly fix the issue.

@Affie
Member Author

Affie commented Oct 10, 2019

Unfortunately it would be a breaking change:

function addVariable!(dfg::G,
                      lbl::Symbol,
                      softtype::T;
                      N::Int=100,
                      autoinit::Bool=true,  # does init need to be separate from ready? TODO
                      ready::Int=1,
                      dontmargin::Bool=false,
                      labels::Vector{Symbol}=Symbol[],
                      smalldata=Dict{String, String}(),
                      checkduplicates::Bool=true  )::DFGVariable where
                        {G <: AbstractDFG,
                         T <: InferenceVariable}
  #
  v = DFGVariable(lbl)
  # needs to add the parameter or the softtype instead, e.g. one of:
  # DFGVariable{T}(lbl)
  # DFGVariable(lbl, softtype)

@Affie Affie modified the milestones: CloudGraph Syncing (v0.4.x), v0.5.0 Oct 10, 2019
@Affie
Member Author

Affie commented Oct 10, 2019

Is it ok if I made the milestone 0.5.0? It's mostly performance and is likely to break a lot of code. Deprecate without fix, pointing to DFGVariable(lbl, softtype), in 0.4.x and then break in 0.5.0?
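
A hypothetical sketch of that deprecation path for 0.4.x, assuming the new DFGVariable(lbl, softtype) constructor is introduced and the existing untyped construction stays available for the transition:

# Keep the old label-only constructor but warn, so downstream code can
# migrate before 0.5.0 removes it.
function DFGVariable(lbl::Symbol)
  Base.depwarn("DFGVariable(lbl) is deprecated, use DFGVariable(lbl, softtype) instead.",
               :DFGVariable)
  # ... fall back to the existing untyped construction for 0.4.x ...
end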

@Affie
Member Author

Affie commented Oct 11, 2019

See #149. I still think making the type parametric will be an advantage, since it is used in the core algorithm.

@dehann
Member

dehann commented Oct 13, 2019

use DFGVariable(lbl, softtype)

I'm not too hot on double template here. Single {T} for variable softtype is okay.

the type parametric will be of advantage since it is used in the core algorithm

PPE is not used in the core algorithm, if that's what you are referring to. PPE is for user convenience -- reduction from the marginal belief estimate.

@Affie
Member Author

Affie commented Oct 13, 2019

I’m not too hot on double template here...

I don’t know how to do it then. Is the whole DFGVariable then not of one type?
EDIT: This is the best I can think of:

solverDataDict::Dict{Symbol, VariableNodeData{<:InferenceVariable}}

I still don't see the problem with <:InferenceVariable

PPE is not used in the core algorithm, if that's what you are referring to. PPE is for user convenience -- reduction from the marginal belief estimate.

DFGVariable has an abstract typed field, softtype::Any. It is used everywhere, not just for the PPE (the PPE does not have a softtype field). Or am I missing something?

@Affie
Member Author

Affie commented Oct 14, 2019

Quickly cleaned up and committed what I had so far, see commit: dcd97da

@dehann
Member

dehann commented Oct 14, 2019

I still don't see the problem with <:InferenceVariable

Bad:

solverDataDict::Dict{Symbol, VariableNodeData{<:InferenceVariable}}

Which is very different from templating:

Good:

mutable struct VariableNodeData{T<:InferenceVariable}
  ...
  softtype::T
  ...
end
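
For what it's worth, the difference shows up directly in the field type; a quick illustrative check (ToySoftType is a hypothetical concrete softtype):

struct ToySoftType <: InferenceVariable end

# With the parametric definition the field type is concrete per instantiation:
isconcretetype(fieldtype(VariableNodeData{ToySoftType}, :softtype))  # true

# With a plain softtype::InferenceVariable field the same check returns false,
# so the compiler has to box the field and dispatch dynamically on access.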

@dehann
Member

dehann commented Oct 14, 2019

with this:

I'm not too hot on double template here

I meant: the PPE compat design issue is not solved by introducing templating. We argued it out. You would have to change the (now old) VariableEstimate again and again and again, which would require a new minor revision number every time. The only way to avoid that (which we could come up with) is to use a dict. Hence PPE changed to PointParametricEstimate, which now contains a Dict (that's PR #158).

  • the performance of something like MyType{T1 <: Abstract9, T2 <: Abstract3} is very good;
  • the performance of containers with abstract types is bad, struct BadType val::Abstract end
