Fix external links, update GH Actions workflow #165

Merged (4 commits) on Jul 23, 2024
13 changes: 9 additions & 4 deletions .github/workflows/test_external_links.yml
```diff
@@ -1,18 +1,23 @@
 name: Testing External Links
-on: pull_request
+on:
+  schedule:
+    - cron: '0 5 * * 0'  # every Sunday at 05:00 UTC (midnight EST)
+  pull_request:
+  workflow_dispatch:
 
 jobs:
   build-and-test:
     name: Testing External Links
     runs-on: ubuntu-latest
     strategy:
       fail-fast: false
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v4
       - run: git fetch --prune --unshallow --tags
       - name: Setup Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v5
        with:
-          python-version: 3.9
+          python-version: 3.11
       - name: Install sphinx Dependencies
        run: pip install -r requirements.txt
      - name: Test External Links
```
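The new `schedule` trigger uses standard five-field cron syntax: `0 5 * * 0` fires at 05:00 UTC every Sunday (cron weekday `0` is Sunday, and 05:00 UTC is midnight EST). As a stand-alone sketch, not part of the PR, the next firing time can be computed with Python's standard library:

```python
from datetime import datetime, timedelta, timezone

def next_sunday_5utc(now: datetime) -> datetime:
    """Next time the cron expression '0 5 * * 0' fires after `now`.

    Cron weekday 0 is Sunday; Python's weekday() uses Monday=0 .. Sunday=6.
    """
    days_ahead = (6 - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=5, minute=0, second=0, microsecond=0
    )
    if candidate <= now:  # already past 05:00 UTC on this Sunday
        candidate += timedelta(days=7)
    return candidate

# The PR was merged on Tue Jul 23, 2024, so the first scheduled run after
# that would be Sunday Jul 28, 2024 at 05:00 UTC.
print(next_sunday_5utc(datetime(2024, 7, 23, 12, 0, tzinfo=timezone.utc)))
```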
9 changes: 2 additions & 7 deletions .readthedocs.yaml
```diff
@@ -6,9 +6,9 @@
 version: 2
 
 build:
-  os: ubuntu-20.04
+  os: ubuntu-22.04
   tools:
-    python: '3.8'
+    python: '3.11'
 
 # Build documentation in the docs/ directory with Sphinx
 sphinx:
@@ -25,8 +25,3 @@ formats: all
 python:
   install:
     - requirements: requirements.txt
-
-# Optionally include all submodules
-submodules:
-  include: all
-  recursive: true
```
4 changes: 2 additions & 2 deletions docs/source/community_gallery/community_gallery.rst
```diff
@@ -14,7 +14,7 @@ Data Conversion
    :class: align-left, no-scaled-link
    :width: 100
 
-:neuroconv-docs:`NeuroConv Catalog <catalogue/catalogue.html>` is a collection of real-world examples of labs using :neuroconv-docs:`NeuroConv <>` to convert their data to NWB files. Each project listed contains a description and a link to an open GitHub repository. Many of the projects listed use advanced customization features beyond what is demonstrated in the core :neuroconv-docs:`NeuroConv conversion gallery <conversion_examples_gallery/conversion_example_gallery.html>`. :bdg-link-primary:`NeuroConv Catalog <https://neuroconv.readthedocs.io/en/main/catalogue/catalogue.html>`
+:neuroconv-docs:`NeuroConv Catalog <catalogue/index.html>` is a collection of real-world examples of labs using :neuroconv-docs:`NeuroConv <>` to convert their data to NWB files. Each project listed contains a description and a link to an open GitHub repository. Many of the projects listed use advanced customization features beyond what is demonstrated in the core :neuroconv-docs:`NeuroConv conversion gallery <conversion_examples_gallery/index.html>`. :bdg-link-primary:`NeuroConv Catalog <https://neuroconv.readthedocs.io/en/main/catalogue/index.html>`
 
 .. image:: figures/bristol_neuroscience_data_guide.png
    :class: align-left, no-scaled-link
@@ -41,7 +41,7 @@ Data Analysis and Reuse
 .. raw:: html
 
    <br/>
 
 In addition, the neuroscience community is creating examples demonstrating the reuse of NWB data published on DANDI. For example:
 
 * The `INCF working group on NWB <https://www.incf.org/sig/incf-working-group-nwb>`_ has created a `library of MATLAB examples <https://github.com/INCF/example-live-scripts>`_ using DANDI datasets authored as MATLAB live scripts. :bdg-link-primary:`Source <https://github.com/INCF/example-live-scripts>`
```
2 changes: 1 addition & 1 deletion docs/source/conf_extlinks.py
```diff
@@ -48,7 +48,7 @@
     'nwb-helpdesk': ('https://github.com/dandi/helpdesk/discussions/%s', '%s'),
     'nwb-mailing-list': ('https://mailchi.mp/fe2a9bc55a1a/nwb-signup/%s', '%s'),
     'datajoint': ('https://www.datajoint.com/%s', '%s'),
-    'datajoint-elements': ('https://www.datajoint.com/elements/%s', '%s'),
+    'datajoint-elements': ('https://datajoint.com/docs/elements/%s', '%s'),
     'openscope-databook': ('https://alleninstitute.github.io/openscope_databook/%s', '%s'),
     'openscope-project': ('https://alleninstitute.org/what-we-do/brain-science/research/mindscope-program/openscope/%s',
                           '%s'),
```
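`conf_extlinks.py` feeds Sphinx's `sphinx.ext.extlinks` extension, where each entry maps a role name to a `(url_template, caption_template)` pair and `%s` is replaced by the role's target. A minimal sketch of that substitution, using the entry updated in this PR (the target `array-ephys` is a made-up value for illustration only):

```python
# Sketch of how sphinx.ext.extlinks expands a role target into a URL and a
# caption. The mapping entry matches the PR; 'array-ephys' is a hypothetical
# target used purely for illustration.
extlinks = {
    'datajoint-elements': ('https://datajoint.com/docs/elements/%s', '%s'),
}

def expand(role: str, target: str) -> tuple[str, str]:
    url_template, caption_template = extlinks[role]
    return url_template % target, caption_template % target

url, caption = expand('datajoint-elements', 'array-ephys')
print(url)      # https://datajoint.com/docs/elements/array-ephys
print(caption)  # array-ephys
```

This is also why `:datajoint-elements:`DataJoint Elements <>`` with an empty target (as used in `datajoint.rst` below) resolves to the bare base URL.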
4 changes: 2 additions & 2 deletions docs/source/conversion_tutorial/02_format_conversions.md
```diff
@@ -2,11 +2,11 @@
 
 [NeuroConv](https://neuroconv.readthedocs.io/) is a library for automatic conversions from proprietary formats to NWB.
 A gallery of all supported formats can be found
-[here](https://neuroconv.readthedocs.io/en/main/conversion_examples_gallery/conversion_example_gallery.html).
+[here](https://neuroconv.readthedocs.io/en/main/conversion_examples_gallery/index.html).
 If NeuroConv supports your source data format, this is the recommended approach, because it is easiest to
 implement and automatically helps you adhere to best practices. For advanced usage of NeuroConv, including creating
 conversions of ensembles of multiple data streams, see the
-[NeuroConv User Guide](https://neuroconv.readthedocs.io/en/main/user_guide/user_guide.html).
+[NeuroConv User Guide](https://neuroconv.readthedocs.io/en/main/user_guide/index.html).
 
 Although NeuroConv supports many common formats, it may not support every type of source data you need.
 If your source format is likely to be a common need in the community, for example the output of an acquisition
```
2 changes: 1 addition & 1 deletion docs/source/tools/datajoint/datajoint.rst
```diff
@@ -5,6 +5,6 @@ DataJoint
 
 .. short_description_start
-:ref:`analysistools-datajoint` is an open-source project for defining and operating computational data pipelines—sequences of steps for data acquisition, processing, and transformation. Some `DataJoint Elements <https://datajoint.com/elements>`_ support automatic conversion to NWB :bdg-link-primary:`Export element_array_ephys to NWB <https://github.com/datajoint/element-array-ephys/blob/main/element_array_ephys/export/nwb/README.md>`
+:ref:`analysistools-datajoint` is an open-source project for defining and operating computational data pipelines—sequences of steps for data acquisition, processing, and transformation. Some :datajoint-elements:`DataJoint Elements <>` support automatic conversion to NWB :bdg-link-primary:`Export element_array_ephys to NWB <https://github.com/datajoint/element-array-ephys/blob/main/element_array_ephys/export/nwb/README.md>`
 
 .. short_description_end
```
2 changes: 1 addition & 1 deletion docs/source/tools/extract/extract.rst
```diff
@@ -10,7 +10,7 @@ which *extracts* the activities of cells as time series from both one-photon and
 EXTRACT makes minimal assumptions about the data, which is the main reason behind its high robustness and superior
 performance.
 :bdg-link-primary:`Source <https://github.com/schnitzer-lab/EXTRACT-public>`
-:bdg-link-primary:`NWB tutorials <https://github.com/schnitzer-lab/EXTRACT-public/tree/master/Learning%20materials/NWB%20Tutorials>`
+:bdg-link-primary:`NWB demos <https://github.com/schnitzer-lab/EXTRACT-public/tree/master/Learning%20materials/Additional%20demos/NWB%20Demos>`
 :bdg-link-primary:`Publication <https://www.biorxiv.org/content/10.1101/2021.03.24.436279v2>`
 
 .. image:: https://img.shields.io/github/stars/schnitzer-lab/EXTRACT-public?style=social
```
2 changes: 1 addition & 1 deletion docs/source/tools/pynapple/pynapple.rst
```diff
@@ -6,7 +6,7 @@ pynapple
 
 .. short_description_start
 
 :ref:`analysistools-pynapple` is a unified toolbox for integrated analysis of multiple data sources. Designed to be "plug & play", users define and import their own time-relevant variables. Supported data sources include, but are not limited to, electrophysiology, calcium imaging, and motion capture data. Pynapple contains integrated functions for common neuroscience analyses, including cross-correlograms, tuning curves, decoding and perievent time histogram.
-:bdg-link-primary:`Docs <https://pynapple-org.github.io/pynapple/>` :bdg-link-primary:`DANDI Demo <https://pynapple-org.github.io/pynapple/generated/gallery/tutorial_pynapple_dandi/>` :bdg-link-primary:`Source <https://github.com/pynapple-org/pynapple>` :bdg-link-primary:`Twitter <https://twitter.com/thepynapple>`
+:bdg-link-primary:`Docs <https://pynapple-org.github.io/pynapple/>` :bdg-link-primary:`DANDI Demo <https://pynapple-org.github.io/pynapple/generated/examples/tutorial_pynapple_dandi/>` :bdg-link-primary:`Source <https://github.com/pynapple-org/pynapple>` :bdg-link-primary:`Twitter <https://twitter.com/thepynapple>`
 
 .. image:: https://img.shields.io/github/stars/pynapple-org/pynapple?style=social
    :alt: GitHub Repo stars for pynapple
```