
Feature prior #130

Merged
merged 7 commits into from
Aug 26, 2019

Conversation

paulstapor
Contributor

@paulstapor paulstapor commented Aug 20, 2019

Added some functionality to sample starting points according to priors; added files for a test (fixes #17).

@paulstapor paulstapor requested a review from dweindl August 20, 2019 16:42
@codecov-io

codecov-io commented Aug 20, 2019

Codecov Report

Merging #130 into develop will increase coverage by 1.13%.
The diff coverage is 92%.

@@             Coverage Diff             @@
##           develop     #130      +/-   ##
===========================================
+ Coverage     67.3%   68.44%   +1.13%     
===========================================
  Files           11       11              
  Lines         1040     1090      +50     
  Branches       241      256      +15     
===========================================
+ Hits           700      746      +46     
- Misses         305      307       +2     
- Partials        35       37       +2
Impacted Files   Coverage Δ
petab/core.py    78.91% <92%> (+2.48%) ⬆️


@paulstapor paulstapor requested review from yannikschaelte, LeonardSchmiester and jvanhoefer and removed request for dweindl August 26, 2019 10:22
    sp = scale(np.random.normal(loc=p_params[0], scale=p_params[1],
                                size=(n_starts,)))

elif p_type == 'logNormal':
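The truncated diff excerpt above can be fleshed out into a self-contained sketch of the prior-based startpoint sampling under discussion. The function name `sample_startpoints` and the identity default for `scale` are assumptions for illustration, not names from the PR; the `logNormal` branch follows the parametrization agreed on below (an exponentiated normal on the linear parameter scale):

```python
import numpy as np


def sample_startpoints(p_type, p_params, n_starts, scale=lambda x: x):
    """Hypothetical sketch: draw n_starts startpoints from a prior.

    p_params[0] and p_params[1] are the mean and standard deviation of
    the (underlying) normal distribution; `scale` maps the samples to
    the desired parameter scale.
    """
    if p_type == 'normal':
        sp = scale(np.random.normal(loc=p_params[0], scale=p_params[1],
                                    size=(n_starts,)))
    elif p_type == 'logNormal':
        # logNormal as an exponentiated normal on the linear scale
        sp = scale(np.exp(np.random.normal(loc=p_params[0],
                                           scale=p_params[1],
                                           size=(n_starts,))))
    else:
        raise ValueError(f"Unknown prior type: {p_type}")
    return sp
```

Note that log-normal samples are strictly positive by construction, which is the point of using this prior for rate parameters.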
Contributor
We need to decide on a consistent parametrization of the distributions.
I understood from our discussion (might be my misunderstanding...) that we agreed on a different parametrization, but I prefer your parametrization here. What do you think @LeonardSchmiester? :)

Contributor Author

I thought that was what we agreed on: introducing log-normal as an exponentiated normal distribution on the linear parameter space? What would be different?

Contributor

I thought we had parametrized it as it is parametrized in scipy. But as I said, I prefer your version and just wanted to make sure that Leonard implements the same parametrization. I will adapt my code accordingly.

Contributor Author

Ah...
Yes, scipy does it differently...
But since we do everything via numpy here, I would also prefer not to get involved with yet another package that has redundant functionality but uses a different definition...
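To make the parametrization difference discussed here concrete, this is a sketch (values are illustrative) of how the same log-normal distribution is expressed in the numpy convention used in this PR versus the scipy convention. In numpy, `mu` and `sigma` are the mean and standard deviation of the underlying normal on the log scale; `scipy.stats.lognorm` instead takes a shape parameter `s=sigma` and folds `mu` into `scale=exp(mu)`:

```python
import numpy as np

mu, sigma = 1.0, 0.5  # mean and std of the underlying normal (numpy-style)
rng = np.random.default_rng(42)

# numpy-style parametrization, as in this PR:
# logNormal = exp of a normal(mu, sigma) on the linear scale.
samples = np.exp(rng.normal(loc=mu, scale=sigma, size=100_000))
# equivalently: rng.lognormal(mean=mu, sigma=sigma, size=100_000)

# scipy.stats.lognorm would express the same distribution as
#     scipy.stats.lognorm(s=sigma, scale=np.exp(mu))
# i.e. the location parameter enters through `scale`, not directly,
# which is a common source of confusion between the two conventions.

# sanity check: log of the samples is normal(mu, sigma)
log_samples = np.log(samples)
```

With 100,000 draws, the sample mean and standard deviation of `log_samples` land close to `mu` and `sigma`, confirming the intended parametrization.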

Contributor

@jvanhoefer jvanhoefer left a comment

Looks fine to me; approved, provided we agree on this parametrization of the logNormal distribution (which I prefer anyhow!)

@dweindl
Member

dweindl commented Aug 26, 2019

@paulstapor : Can you please include a section in the documentation on the (meaning of the) supported values for the respective columns?

@paulstapor
Contributor Author

@paulstapor : Can you please include a section in the documentation on the (meaning of the) supported values for the respective columns?

Indeed, that's still missing. Will do that before merging!

@paulstapor
Contributor Author

Okay, I updated the documentation, including the additional columns we're actually still discussing (initializationPriorType, initializationPriorParameters, objectivePriorType, and objectivePriorParameters). These are yet to come, but fairly simple to implement. I think/hope it was safe to include this in the "Extensions" part of the documentation...
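For illustration, a parameter table carrying the four columns named above might look like the following hypothetical sketch. The parameter ids and values are invented, and the `;`-separated prior parameters are an assumption about the convention under discussion, not the authoritative PEtab format:

```python
import csv
import io

# Hypothetical PEtab-style parameter table (tab-separated) with the
# prior columns discussed in this PR; content is illustrative only.
tsv = (
    "parameterId\tinitializationPriorType\tinitializationPriorParameters\t"
    "objectivePriorType\tobjectivePriorParameters\n"
    "k1\tnormal\t0;1\tnormal\t0;1\n"
    "k2\tlogNormal\t-1;0.5\tlogNormal\t-1;0.5\n"
)

# Parse it the way a downstream tool might, one dict per parameter row.
rows = list(csv.DictReader(io.StringIO(tsv), delimiter='\t'))
```

Here each prior is described by a type column plus a parameter column holding the distribution's parameters, mirroring the numpy-style parametrization settled on earlier in this thread.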

@paulstapor
Contributor Author

Can we agree that this now fixes #17?

@dweindl
Member

dweindl commented Aug 26, 2019

Can we agree that this now fixes #17?

Okay for me.

@paulstapor paulstapor changed the title Feature prior Feature prior, fixes #17 Aug 26, 2019
@paulstapor paulstapor changed the title Feature prior, fixes #17 Feature prior Aug 26, 2019
@paulstapor
Contributor Author

paulstapor commented Aug 26, 2019

May/Can/Shall I merge?
Or should I first address #131 ?

@paulstapor
Contributor Author

May/Can/Shall I merge?
Or should I first address #131 ?

That's a PEtab issue. Merging in pesto should be done independently of that.

Both things (this PR and the Issue) are PEtab things...
I don't see where pypesto comes in here...

@FFroehlich
Collaborator

May/Can/Shall I merge?
Or should I first address #131 ?

That's a PEtab issue. Merging in pesto should be done independently of that.

Both things (this PR and the Issue) are PEtab things...
I don't see where pypesto comes in here...

Yeah, sorry. Just ignore my comment 🤦‍♂

@paulstapor paulstapor merged commit ce384bf into develop Aug 26, 2019
@paulstapor paulstapor deleted the feature_prior branch August 26, 2019 19:05
elbaraim pushed a commit that referenced this pull request Oct 4, 2019
* added an initial parameter sampling functionality

* add tests for startpoint sampling

* added documentation for the additional columns initialization and objectivePriorType and Parameters