sync up with fork #37

Merged on Feb 14, 2024 (53 commits)
Commits
55f1247  added link to ilastik CI (k-dominik, Oct 2, 2018)
38d4b7c  Remove sys.path modification from library (m-novikov, Nov 15, 2018)
ef95470  Merge pull request #5 from m-novikov/fix-sys-path (k-dominik, Nov 23, 2018)
35f0103  added circleci badge (k-dominik, Nov 23, 2018)
8ba2735  cleanup, remove future imports (k-dominik, Feb 25, 2019)
7410013  refactor: getLogger-> logger (k-dominik, Feb 26, 2019)
8380c9f  black (k-dominik, Feb 26, 2019)
d949ccc  cleanup: removed unused imports (k-dominik, Feb 26, 2019)
5ff2b85  quickfix: load json (k-dominik, Feb 26, 2019)
d87cb0e  refactor networkx 2.0 (k-dominik, Feb 26, 2019)
ff2026d  refactor: networkx 2.0 (k-dominik, Feb 26, 2019)
8790f38  black (k-dominik, Feb 26, 2019)
45ba2dd  upgrade networkx, add jinja2 to deps (k-dominik, Feb 26, 2019)
d229e84  refactor, fix tests (k-dominik, Feb 26, 2019)
f1992a0  Merge branch 'upgrade-networkx' into v0.1.1 (k-dominik, Feb 26, 2019)
fa52e12  ref to ilastik, version bump (k-dominik, Feb 26, 2019)
598d42f  ci gymnastics: build, test, install (k-dominik, Feb 26, 2019)
f820de0  added exception with sensible message if division and merger (k-dominik, Feb 28, 2019)
b8121dc  bump build (k-dominik, Mar 4, 2019)
a60dfcb  Merge pull request #6 from ilastik/v0.1.1 (k-dominik, Apr 1, 2019)
0dc28de  added appveyor ci for windows (k-dominik, Apr 1, 2019)
3d08480  Fix windows pickling issue (m-novikov, Apr 2, 2019)
3a395a2  Fix build script (m-novikov, Apr 2, 2019)
2e11978  Add ilastik-forge channel (m-novikov, Apr 2, 2019)
1a2b8b2  Merge pull request #7 from m-novikov/fix-windows-mp (m-novikov, Apr 2, 2019)
4628e4b  bump build (k-dominik, Apr 4, 2019)
9ffbfa1  refactored some remaining networkx1.x syntax (k-dominik, Apr 5, 2019)
fcc763a  bump build (k-dominik, Apr 5, 2019)
ae7ec43  Merge branch 'quick-fixes' (k-dominik, Apr 5, 2019)
f348519  fix remove_edges_from called with view (k-dominik, Jul 9, 2019)
6c05e9b  upd: ci (k-dominik, Jul 9, 2019)
e272f3f  KDTree interface change (k-dominik, Jul 9, 2019)
f43b88c  Merge pull request #8 from ilastik/fix-remove-edges (k-dominik, Jul 10, 2019)
a1b9787  h5py deprecation of dataset.value -> dataset[()] (k-dominik, Oct 29, 2021)
f583d40  changes to allow networkx > 2.3 (k-dominik, Oct 29, 2021)
1608874  mnt: clear python pin (k-dominik, Oct 29, 2021)
98a82d1  fix tests (k-dominik, Oct 29, 2021)
bb6ddff  add black config, devenv (k-dominik, Oct 29, 2021)
db64a24  rename reserved kw (k-dominik, Oct 29, 2021)
1981241  couldn't look at it anymore, applied black (k-dominik, Oct 29, 2021)
bbdfa34  fix tests on win (k-dominik, Oct 29, 2021)
3d9f162  run tests during conda-bld (k-dominik, Nov 2, 2021)
dc0ad6d  fix: float comparison in tests (k-dominik, Nov 2, 2021)
3a9ebe9  Merge pull request #10 from ilastik/py38py39 (k-dominik, Nov 2, 2021)
806c48a  added random seed for ilastik to resolve merges reproducibly (k-dominik, Jan 27, 2023)
392dda3  make sure to index with tuple (k-dominik, Jan 27, 2023)
2c31fb4  consistent object labeling (k-dominik, Jan 30, 2023)
15d7cfe  remove circleci ci, appveyor, prepare for gh actions (k-dominik, Jan 30, 2023)
b888e7b  added test for all three osses (via conda bld) (k-dominik, Jan 30, 2023)
04943cd  add deploy workflow (k-dominik, Jan 30, 2023)
0e70564  update some niceness (k-dominik, Jan 30, 2023)
9fb47e0  allow for 2ulp accuracy (k-dominik, Jan 30, 2023)
fd7342d  Merge pull request #11 from ilastik/reproducible-merger-resolvers (k-dominik, Jan 30, 2023)
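A number of these commits (d87cb0e, ff2026d, 9ffbfa1, f583d40 and f348519) port the code base from the networkx 1.x API to 2.x. The project's own changes are spread across many files and are not reproduced here; the following minimal sketch, using made-up node and attribute names, only illustrates the kind of adjustment involved: node and edge attributes are reached through the .nodes/.edges views, and a view must be materialized (e.g. into a list) before it is used to mutate the graph it came from, which is what f348519 fixes for remove_edges_from.

import networkx as nx

g = nx.DiGraph()
g.add_edge(1, 2, weight=0.3)  # illustrative graph, not project data
g.add_edge(2, 3, weight=0.9)

# networkx 1.x style (removed in 2.x):
#   g.node[1]['value'] = 5
#   w = g.edge[1][2]['weight']
# networkx 2.x style:
g.nodes[1]["value"] = 5
w = g.edges[1, 2]["weight"]

# remove_edges_from must not be fed a live view over the same graph;
# materialize the selection first, then remove.
too_light = [(u, v) for u, v, d in g.edges(data=True) if d["weight"] < 0.5]
g.remove_edges_from(too_light)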
Files changed
56 changes: 0 additions & 56 deletions .circleci/config.yml

This file was deleted.

35 changes: 35 additions & 0 deletions .github/workflows/deploy.yml
@@ -0,0 +1,35 @@
name: deploy

on:
  push:
    tags:
      - '*'


jobs:
  deploy-to-ilastik-forge:
    # noarch build - build on linux, only
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - uses: actions/checkout@v3
        with:
          repository: ilastik/ilastik-conda-recipes
          path: ilastik-conda-recipes
      - uses: conda-incubator/setup-miniconda@v2
        with:
          auto-update-conda: true
          auto-activate-base: true
          activate-environment: ""
          miniforge-variant: Mambaforge
          use-mamba: true
      - name: linux conda build and upload
        shell: bash -l {0}
        env:
          ANACONDA_API_TOKEN: ${{ secrets.ANACONDA_TOKEN }}
        run: |
          mamba install -n base -c conda-forge boa setuptools_scm anaconda-client -y
          mamba config --set anaconda_upload yes
          conda mambabuild -c ilastik-forge -c conda-forge -m ilastik-conda-recipes/ilastik-pins.yaml --user ilastik-forge conda-recipe
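In short: on every pushed tag the workflow checks out the repository together with ilastik/ilastik-conda-recipes, sets up a Mambaforge-based conda, and runs a single noarch conda mambabuild on Linux against the ilastik pin file, uploading the resulting package to the ilastik-forge channel via the ANACONDA_TOKEN secret.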
40 changes: 40 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,40 @@
name: test

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  test-w-conda-recipe:
    strategy:
      fail-fast: false
      matrix:
        os: [macos-latest, windows-latest, ubuntu-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - uses: conda-incubator/setup-miniconda@v2
        with:
          activate-environment: ""
          auto-activate-base: true
          auto-update-conda: true
          miniforge-variant: Mambaforge
          use-mamba: true
      - name: install build deps
        run: mamba install -n base -c conda-forge boa setuptools_scm -y
      - name: linux conda build test
        if: matrix.os == 'ubuntu-latest'
        shell: bash -l {0}
        run: conda mambabuild -c ilastik-forge -c conda-forge conda-recipe
      - name: osx test
        if: matrix.os == 'macos-latest'
        shell: bash -l {0}
        run: conda mambabuild -c ilastik-forge -c conda-forge conda-recipe
      - name: windows conda-build
        if: matrix.os == 'windows-latest'
        shell: cmd /C CALL {0}
        run: conda mambabuild -c ilastik-forge -c conda-forge conda-recipe
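The test workflow runs the same recipe-based conda mambabuild (which executes the package's test suite as part of the build) on Ubuntu, macOS and Windows for every push and pull request targeting main.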
1 change: 1 addition & 0 deletions .gitignore
@@ -6,6 +6,7 @@
CMakeCache*
*.pyc
*_test
__pycache__
tags*
Testing
CMakeFiles
4 changes: 3 additions & 1 deletion Readme.md
@@ -2,7 +2,9 @@

By Carsten Haubold, Steffen Wolf, Letitia Parcalabescu, Bernhard Kausler, Martin Schiegg, Jaime I. Cervantes, Janez Ales and more.

* Build status: [ ![Circle CI](https://circleci.com/gh/chaubold/hytra.png?style=shield&circle-token=27b4fff289dfdb41575cecfab8e865c7cac6a099) ](https://circleci.com/gh/chaubold/hytra)

* build status: [![test](https://github.com/ilastik/hytra/actions/workflows/test.yml/badge.svg)](https://github.com/ilastik/hytra/actions/workflows/test.yml)
* conda: ![last updated](https://anaconda.org/ilastik-forge/hytra/badges/latest_release_date.svg) ![latest-version](https://anaconda.org/ilastik-forge/hytra/badges/version.svg)
* Usage documentation can be found in this [Google document](https://docs.google.com/document/d/1jxkYGlTEUCPqH03pip03eDBBX2pVYEhPGHHvbegHiWw/edit?usp=sharing)
* [API Docs](http://chaubold.github.io/hytra/hytra/index.html)
* Run tests using `nosetests tests` from the root folder
8 changes: 1 addition & 7 deletions conda-recipe/conda_build_config.yaml
@@ -1,8 +1,2 @@
networkx:
- 1.11
python:
- 3.6


pin_run_as_build:
python: x.x
- 2
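Net effect: the old pins on networkx 1.11 and Python 3.6, together with the pin_run_as_build section, are dropped, leaving networkx 2 as the only entry in the build config.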
17 changes: 8 additions & 9 deletions conda-recipe/meta.yaml
@@ -21,18 +21,17 @@ build:

requirements:
build:
- python {{ python }}
- python >=3.6
- pip

run:
- python >=2.7
- dpct
- networkx <={{ networkx }}
- yapsy
- vigra
- scikit-learn
- scikit-image
- h5py
- networkx >={{ networkx }}
- python >=3.6
- scikit-image
- scikit-learn
- vigra
- yapsy

test:
source_files:
@@ -50,6 +49,6 @@ test:
- nosetests tests

about:
home: https://github.com/chaubold/hytra
home: https://github.com/ilastik/hytra
license: MIT
summary: 'Python tracking framework developed at the IAL lab @ University of Heidelberg'
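Summary of the recipe changes: the build section now asks for python >=3.6 and pip instead of a pinned python, the run dependencies drop the python >=2.7 floor and the networkx upper bound in favour of networkx >={{ networkx }} and python >=3.6 (with the remaining packages listed alphabetically), and the project home now points to the ilastik organisation.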
19 changes: 19 additions & 0 deletions dev/environment-dev.yaml
@@ -0,0 +1,19 @@
name: hytra-dev
channels:
  - ilastik-forge
  - conda-forge
  - defaults
dependencies:
  - black
  - configargparse
  - dpct
  - h5py
  - jinja2
  - networkx >=2.2
  - nose
  - pre_commit
  - python >=3.7
  - scikit-image
  - scikit-learn
  - vigra
  - yapsy
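To work on the package locally, this environment can presumably be created with `mamba env create -f dev/environment-dev.yaml` (or the equivalent conda env create command) and then activated as hytra-dev.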
8 changes: 4 additions & 4 deletions empryonic/filter.py
@@ -24,7 +24,7 @@ def filterFeaturesByIntensity(h5In, h5Out, threshold = 1500):
h5Out: output path; file will be overwritten if already existing
'''
def intensityFilter(labelGroup):
intMaximum = labelGroup[intminmax].value[1]
intMaximum = labelGroup[intminmax][()][1]
return (intMaximum >= threshold)
filterFeaturesByPredicate(h5In, h5Out, intensityFilter)

@@ -81,17 +81,17 @@ def filterFeaturesByPredicate(h5In, h5Out, predicate):

# supervoxels
print "labelcount = ", labelcount
outFeaturesGroup.create_dataset(labelcount, data=featuresGroup[labelcount].value)
outFeaturesGroup.create_dataset(labelcount, data=featuresGroup[labelcount][()])

# featureconfig
outFeaturesGroup.create_dataset(featureconfig, data=featuresGroup[featureconfig].value)
outFeaturesGroup.create_dataset(featureconfig, data=featuresGroup[featureconfig][()])

# labels
for labelGroup in validLabelGroups:
outFile.copy(labelGroup, outFeaturesGroup)

# labelcontent
inLabelcontent = featuresGroup[labelcontent].value
inLabelcontent = featuresGroup[labelcontent][()]
outLabelcontent = np.zeros(inLabelcontent.shape, dtype=inLabelcontent.dtype)

validLabels = filter(lambda item: item.isdigit(), outFeaturesGroup.keys())
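The empryonic/filter.py and empryonic/io.py changes below all follow the same pattern: h5py deprecated (and, as of h5py 3.0, removed) the Dataset.value property, so reading a whole dataset now uses empty-tuple indexing instead. A minimal sketch of the migration (the file name and dataset paths are made up for illustration):

import h5py

with h5py.File("features.h5", "r") as f:  # hypothetical file and paths
    # old, removed in h5py >= 3.0:
    #   ids = f["objects/meta/id"].value
    # new: empty-tuple indexing reads the full dataset into memory
    ids = f["objects/meta/id"][()]
    valid = f["objects/meta/valid"][()]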
32 changes: 16 additions & 16 deletions empryonic/io.py
@@ -15,7 +15,7 @@ def __loadDataset( filename, h5path):
h5path: location of the data inside the hdf5 file
'''
f = h5py.File( filename, mode='r' )
data = f[h5path].value
data = f[h5path][()]
f.close()
return data

@@ -214,25 +214,25 @@ def update_moves( self, mov_pairs ):

def get_moves( self ):
if self.has_tracking() and _path.basename(self.mov_ds) in self[self.track_gn].keys():
return self[self.mov_ds].value
return self[self.mov_ds][()]
else:
return np.empty(0)

def get_mergers( self ):
if self.has_tracking() and _path.basename(self.merg_ds) in self[self.track_gn].keys():
return self[self.merg_ds].value
return self[self.merg_ds][()]
else:
return np.empty(0)

def get_multiFrameMoves( self ):
if self.has_tracking() and _path.basename(self.multi_ds) in self[self.track_gn].keys():
return self[self.multi_ds].value
return self[self.multi_ds][()]
else:
return np.empty(0)

def get_move_energies( self ):
if _path.basename(self.mov_ener_ds) in self[self.track_gn].keys():
e = self[self.mov_ener_ds].value
e = self[self.mov_ener_ds][()]
if isinstance(e, np.ndarray):
return e
else:
@@ -243,7 +243,7 @@ def get_move_energies( self ):

def get_divisions( self ):
if self.has_tracking() and _path.basename(self.div_ds) in self[self.track_gn].keys():
return self[self.div_ds].value
return self[self.div_ds][()]
else:
return np.empty(0)

@@ -255,7 +255,7 @@ def update_divisions( self, div_triples ):

def get_division_energies( self ):
if _path.basename(self.div_ener_ds) in self[self.track_gn].keys():
e = self[self.div_ener_ds].value
e = self[self.div_ener_ds][()]
if isinstance(e, np.ndarray):
return e
else:
@@ -265,7 +265,7 @@ def get_division_energies( self ):

def get_disappearances( self ):
if self.has_tracking() and _path.basename(self.dis_ds) in self[self.track_gn].keys():
dis = self[self.dis_ds].value
dis = self[self.dis_ds][()]
if isinstance(dis, np.ndarray):
return dis
else:
@@ -281,7 +281,7 @@ def update_disappearances( self, dis_singlets ):

def get_disappearance_energies( self ):
if _path.basename(self.dis_ener_ds) in self[self.track_gn].keys():
e = self[self.dis_ener_ds].value
e = self[self.dis_ener_ds][()]
if isinstance(e, np.ndarray):
return e
else:
@@ -292,7 +292,7 @@ def get_disappearance_energies( self ):

def get_appearances( self ):
if self.has_tracking() and _path.basename(self.app_ds) in self[self.track_gn].keys():
app = self[self.app_ds].value
app = self[self.app_ds][()]
if isinstance(app, np.ndarray):
return app
else:
@@ -308,7 +308,7 @@ def update_appearances( self, app_singlets ):

def get_appearance_energies( self ):
if _path.basename(self.app_ener_ds) in self[self.track_gn].keys():
e = self[self.app_ener_ds].value
e = self[self.app_ener_ds][()]
if isinstance(e, np.ndarray):
return e
else:
@@ -336,7 +336,7 @@ def rm_disappearance( self, id ):

def get_ids( self ):
features_group = self[self.feat_gn]
labelcontent = features_group["labelcontent"].value
labelcontent = features_group["labelcontent"][()]
valid_labels = (np.arange(len(labelcontent))+1)[labelcontent==1]
return valid_labels

@@ -365,16 +365,16 @@ def cTraxels( self, as_python_list=False, prediction_threshold=None ):
def _cTraxels_from_objects_group( self , as_python_list = False, prediction_threshold=None):
objects_g = self["objects"]
features_g = self["objects/features"]
ids = objects_g["meta/id"].value
valid = objects_g["meta/valid"].value
ids = objects_g["meta/id"][()]
valid = objects_g["meta/valid"][()]
prediction = None
if "prediction" in objects_g["meta"]:
prediction = objects_g["meta/prediction"]
elif prediction_threshold:
raise Exception("prediction_threshold set, but no prediction dataset found")
features = {}
for name in features_g.keys():
features[name] = features_g[name].value
features[name] = features_g[name][()]

if as_python_list:
ts = list()
@@ -404,7 +404,7 @@ def _cTraxels_from_objects_group( self , as_python_list = False, prediction_thre

def _cTraxels_from_features_group( self ):
features_group = self[self.feat_gn]
labelcontent = features_group["labelcontent"].value
labelcontent = features_group["labelcontent"][()]
invalid_labels = (np.arange(len(labelcontent))+1)[labelcontent==0]

# note, that we used the ctracklet_from_labelgroup() here before, but had
4 changes: 2 additions & 2 deletions empryonic/learning/optimal_matching.py
@@ -135,8 +135,8 @@ def _formulate_associations( graph, solved_ilp_variables ):
for id, var in solved_ilp_variables.items():
if var.value() == 1:
match = graph.edges[id]
lhs = graph.lhs[match.id_lhs].value
rhs = graph.rhs[match.id_rhs].value
lhs = graph.lhs[match.id_lhs][()]
rhs = graph.rhs[match.id_rhs][()]
if lhs != None:
assoc['lhs'][lhs] = rhs
if rhs != None: