From 0cc646f1533d7916549d7bfc560012dc021c33aa Mon Sep 17 00:00:00 2001
From: Dref360
Date: Sun, 19 Jun 2022 16:08:29 -0400
Subject: [PATCH 01/10] Add structure for mkdocs instead of Sphinx
---
.github/ISSUE_TEMPLATE/bug_report.md | 4 +-
CONTRIBUTING.md | 2 +-
README.md | 6 +-
.../dirichlet_calibration.md | 0
docs/{reports => blog}/double_descend.md | 0
docs/{reports => blog}/images/ALL_active.png | Bin
.../images/BALDvsCBALD_active.png | Bin
docs/{reports => blog}/images/CBALDvsBALD.png | Bin
.../images/CBALDvsBALDECE.png | Bin
.../images/EntvsCEnt_active.png | Bin
.../images/dirichlet_calib.png | Bin
.../images/doubledescend_01.png | Bin
.../images/doubledescend_02.png | Bin
.../images/doubledescend_03.png | Bin
.../images/doubledescend_04.png | Bin
docs/conf.py | 231 ----
docs/index.md | 41 +-
docs/industry/index.md | 0
docs/{tutorials => industry}/label-studio.md | 0
docs/research/index.md | 2 +
.../literature/Additional papers/dmi.md | 0
.../literature/Additional papers/duq.md | 0
.../literature/Additional papers/gyolov3.md | 0
.../Additional papers/lightcoresets.md | 0
.../Additional papers/sparse_selection.md | 0
.../literature/Additional papers/vaal.md | 0
docs/{ => research}/literature/core-papers.md | 0
.../literature/images/Baalscheme.svg | 0
.../literature/images/GYOLOV3/fig1.png | Bin
.../literature/images/GYOLOV3/fig2.png | Bin
.../literature/images/GYOLOV3/fig3.png | Bin
.../literature/images/dmi/fig3.png | Bin
.../experiment_results/iterations_mcdc.png | Bin
.../literature/images/lightcoreset/q_func.png | Bin
.../literature/images/logo_original.png | Bin
.../literature/images/repo_logo_25.jpg | Bin
.../images/repo_logo_25_no_corner.svg | 0
.../images/sparse_selection/eq4.png | Bin
.../images/sparse_selection/fig4.png | Bin
.../literature/images/vaal/fig1.png | Bin
.../literature/images/vaal/fig2.png | Bin
docs/{ => research}/literature/index.md | 0
docs/{ => research}/literature/more_papers.md | 0
docs/{ => support}/faq.md | 2 +-
docs/support/index.md | 2 +
docs/user_guide/index.md | 2 +-
mkdocs.yml | 50 +
poetry.lock | 1128 +++++------------
pyproject.toml | 19 +-
49 files changed, 367 insertions(+), 1122 deletions(-)
rename docs/{reports => blog}/dirichlet_calibration.md (100%)
rename docs/{reports => blog}/double_descend.md (100%)
rename docs/{reports => blog}/images/ALL_active.png (100%)
rename docs/{reports => blog}/images/BALDvsCBALD_active.png (100%)
rename docs/{reports => blog}/images/CBALDvsBALD.png (100%)
rename docs/{reports => blog}/images/CBALDvsBALDECE.png (100%)
rename docs/{reports => blog}/images/EntvsCEnt_active.png (100%)
rename docs/{reports => blog}/images/dirichlet_calib.png (100%)
rename docs/{reports => blog}/images/doubledescend_01.png (100%)
rename docs/{reports => blog}/images/doubledescend_02.png (100%)
rename docs/{reports => blog}/images/doubledescend_03.png (100%)
rename docs/{reports => blog}/images/doubledescend_04.png (100%)
delete mode 100644 docs/conf.py
create mode 100644 docs/industry/index.md
rename docs/{tutorials => industry}/label-studio.md (100%)
create mode 100644 docs/research/index.md
rename docs/{ => research}/literature/Additional papers/dmi.md (100%)
rename docs/{ => research}/literature/Additional papers/duq.md (100%)
rename docs/{ => research}/literature/Additional papers/gyolov3.md (100%)
rename docs/{ => research}/literature/Additional papers/lightcoresets.md (100%)
rename docs/{ => research}/literature/Additional papers/sparse_selection.md (100%)
rename docs/{ => research}/literature/Additional papers/vaal.md (100%)
rename docs/{ => research}/literature/core-papers.md (100%)
rename docs/{ => research}/literature/images/Baalscheme.svg (100%)
rename docs/{ => research}/literature/images/GYOLOV3/fig1.png (100%)
rename docs/{ => research}/literature/images/GYOLOV3/fig2.png (100%)
rename docs/{ => research}/literature/images/GYOLOV3/fig3.png (100%)
rename docs/{ => research}/literature/images/dmi/fig3.png (100%)
rename docs/{ => research}/literature/images/experiment_results/iterations_mcdc.png (100%)
rename docs/{ => research}/literature/images/lightcoreset/q_func.png (100%)
rename docs/{ => research}/literature/images/logo_original.png (100%)
rename docs/{ => research}/literature/images/repo_logo_25.jpg (100%)
rename docs/{ => research}/literature/images/repo_logo_25_no_corner.svg (100%)
rename docs/{ => research}/literature/images/sparse_selection/eq4.png (100%)
rename docs/{ => research}/literature/images/sparse_selection/fig4.png (100%)
rename docs/{ => research}/literature/images/vaal/fig1.png (100%)
rename docs/{ => research}/literature/images/vaal/fig2.png (100%)
rename docs/{ => research}/literature/index.md (100%)
rename docs/{ => research}/literature/more_papers.md (100%)
rename docs/{ => support}/faq.md (98%)
create mode 100644 docs/support/index.md
create mode 100644 mkdocs.yml
diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md
index b09b679a..332ac4a7 100644
--- a/.github/ISSUE_TEMPLATE/bug_report.md
+++ b/.github/ISSUE_TEMPLATE/bug_report.md
@@ -15,7 +15,7 @@ A clear and concise description of what the bug is.
Some advice:
* Please don't use custom data or custom paths.
* Use random arrays or even np.ones, np.zeros.
-* The example should run with BaaL (and deps) alone.
+* The example should run with Baal (and deps) alone.
* Should be Python3 compatible.
* Should not be OS specific.
* The file should reproduce the bug with **high** fidelity.
@@ -29,7 +29,7 @@ A clear and concise description of what you expected to happen.
**Version (please complete the following information):**
- OS:
- Python:
- - BaaL version:
+ - Baal version:
**Additional context**
Add any other context about the problem here.
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 8afb10f0..5e87b0a4 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,4 +1,4 @@
-## How can you contribute to BaaL.
+## How can you contribute to Baal.
### Bug fixes
diff --git a/README.md b/README.md
index 1d264f6e..a03ea2bb 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
-
Bayesian Active Learning (BaaL)
+
-BaaL is an active learning library developed at
+Baal is an active learning library developed at
[ElementAI](https://www.elementai.com/). This repository contains techniques
and reusable components to make active learning accessible for all.
@@ -88,7 +88,7 @@ The framework consists of four main parts, as demonstrated in the flowchart belo
- ActiveLearningLoop
-
+
To get started, wrap your dataset in our _[**ActiveLearningDataset**](baal/active/dataset.py)_ class. This will ensure that the dataset is split into
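
The `ActiveLearningDataset` wrapper mentioned in the README hunk above is the library's entry point. As a minimal, hypothetical sketch of that wrapping step (not part of this patch; it assumes torchvision's CIFAR10 as the underlying dataset and the `label_randomly` helper for seeding the labelled pool):

```python
from torchvision import transforms
from torchvision.datasets import CIFAR10

from baal.active.dataset import ActiveLearningDataset

# Wrap a regular PyTorch dataset so Baal can track labelled vs. unlabelled samples.
train_set = CIFAR10(".", train=True, download=True, transform=transforms.ToTensor())
al_dataset = ActiveLearningDataset(train_set)

# Seed the labelled pool; everything else stays in the unlabelled pool
# until a heuristic selects it, e.g. via al_dataset.label(ranks, labels).
al_dataset.label_randomly(100)
```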
diff --git a/docs/reports/dirichlet_calibration.md b/docs/blog/dirichlet_calibration.md
similarity index 100%
rename from docs/reports/dirichlet_calibration.md
rename to docs/blog/dirichlet_calibration.md
diff --git a/docs/reports/double_descend.md b/docs/blog/double_descend.md
similarity index 100%
rename from docs/reports/double_descend.md
rename to docs/blog/double_descend.md
diff --git a/docs/reports/images/ALL_active.png b/docs/blog/images/ALL_active.png
similarity index 100%
rename from docs/reports/images/ALL_active.png
rename to docs/blog/images/ALL_active.png
diff --git a/docs/reports/images/BALDvsCBALD_active.png b/docs/blog/images/BALDvsCBALD_active.png
similarity index 100%
rename from docs/reports/images/BALDvsCBALD_active.png
rename to docs/blog/images/BALDvsCBALD_active.png
diff --git a/docs/reports/images/CBALDvsBALD.png b/docs/blog/images/CBALDvsBALD.png
similarity index 100%
rename from docs/reports/images/CBALDvsBALD.png
rename to docs/blog/images/CBALDvsBALD.png
diff --git a/docs/reports/images/CBALDvsBALDECE.png b/docs/blog/images/CBALDvsBALDECE.png
similarity index 100%
rename from docs/reports/images/CBALDvsBALDECE.png
rename to docs/blog/images/CBALDvsBALDECE.png
diff --git a/docs/reports/images/EntvsCEnt_active.png b/docs/blog/images/EntvsCEnt_active.png
similarity index 100%
rename from docs/reports/images/EntvsCEnt_active.png
rename to docs/blog/images/EntvsCEnt_active.png
diff --git a/docs/reports/images/dirichlet_calib.png b/docs/blog/images/dirichlet_calib.png
similarity index 100%
rename from docs/reports/images/dirichlet_calib.png
rename to docs/blog/images/dirichlet_calib.png
diff --git a/docs/reports/images/doubledescend_01.png b/docs/blog/images/doubledescend_01.png
similarity index 100%
rename from docs/reports/images/doubledescend_01.png
rename to docs/blog/images/doubledescend_01.png
diff --git a/docs/reports/images/doubledescend_02.png b/docs/blog/images/doubledescend_02.png
similarity index 100%
rename from docs/reports/images/doubledescend_02.png
rename to docs/blog/images/doubledescend_02.png
diff --git a/docs/reports/images/doubledescend_03.png b/docs/blog/images/doubledescend_03.png
similarity index 100%
rename from docs/reports/images/doubledescend_03.png
rename to docs/blog/images/doubledescend_03.png
diff --git a/docs/reports/images/doubledescend_04.png b/docs/blog/images/doubledescend_04.png
similarity index 100%
rename from docs/reports/images/doubledescend_04.png
rename to docs/blog/images/doubledescend_04.png
diff --git a/docs/conf.py b/docs/conf.py
deleted file mode 100644
index 14c88552..00000000
--- a/docs/conf.py
+++ /dev/null
@@ -1,231 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Configuration file for the Sphinx documentation builder.
-#
-# This file does only contain a selection of the most common options. For a
-# full list see the documentation:
-# http://www.sphinx-doc.org/en/master/config
-
-# -- Path setup --------------------------------------------------------------
-
-# If extensions (or modules to document with autodoc) are in another directory,
-# add these directories to sys.path here. If the directory is relative to the
-# documentation root, use os.path.abspath to make it absolute, like shown here.
-#
-import os
-import pathlib
-import shutil
-import sys
-
-from recommonmark.transform import AutoStructify
-import sphinx_rtd_theme
-from recommonmark.parser import CommonMarkParser
-
-pjoin = os.path.join
-parent_dir = pathlib.Path(__file__).resolve().parents[1]
-sys.path.insert(0, os.path.abspath('./../'))
-
-shutil.rmtree('notebooks', ignore_errors=True)
-shutil.copytree(pjoin(parent_dir, 'notebooks'), 'notebooks')
-
-# -- Project information -----------------------------------------------------
-
-# Disable notebook execution
-nbsphinx_execute = 'never'
-
-project = 'baal'
-copyright = '2019, Parmida Atighehchian, Frédéric Branchaud-Charron, Jan Freyberg'
-author = 'Parmida Atighehchian, Frédéric Branchaud-Charron, Jan Freyberg'
-
-# The short X.Y version
-version = ''
-# The full version, including alpha/beta/rc tags
-release = '1.6.0'
-
-# -- General configuration ---------------------------------------------------
-
-# If your documentation needs a minimal Sphinx version, state it here.
-#
-# needs_sphinx = '1.0'
-
-# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
-# ones.
-extensions = [
- 'sphinx.ext.autodoc',
- 'sphinx.ext.autosummary',
- 'sphinx.ext.doctest',
- 'sphinx.ext.todo',
- 'sphinx.ext.mathjax',
- 'sphinx.ext.viewcode',
- 'sphinx_copybutton',
- "sphinx_automodapi.automodapi",
- "nbsphinx",
- "recommonmark",
- "numpydoc",
- "sphinx.ext.napoleon"
-]
-
-# We need to mock these packages to compile without deps.
-autodoc_mock_imports = ["PIL", "tqdm", "structlog", "torch", "torchvision", "numpy", "sklearn",
- "scipy", "baal.utils.cuda_utils", "transformers", "pytorch_lightning",
- "datasets"]
-
-# Add any paths that contain templates here, relative to this directory.
-templates_path = ['_templates']
-
-
-
-source_parsers = {
- '.md': CommonMarkParser,
-}
-
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md', '.ipynb']
-
-# The master toctree document.
-master_doc = 'index'
-
-# The language for content autogenerated by Sphinx. Refer to documentation
-# for a list of supported languages.
-#
-# This is also used if you do content translation via gettext catalogs.
-# Usually you set "language" from the command line for these cases.
-language = None
-
-# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-# This pattern also affects html_static_path and html_extra_path.
-exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store', '.ipynb_checkpoints']
-
-# The name of the Pygments (syntax highlighting) style to use.
-pygments_style = None
-
-# -- Options for HTML output -------------------------------------------------
-
-# The theme to use for HTML and HTML Help pages. See the documentation for
-# a list of builtin themes.
-#
-html_theme = "sphinx_rtd_theme"
-html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-html_logo = "_static/images/logo-transparent.png"
-
-# Theme options are theme-specific and customize the look and feel of a theme
-# further. For a list of options available for each theme, see the
-# documentation.
-#
-# html_theme_options = {}
-
-# Add any paths that contain custom static files (such as style sheets) here,
-# relative to this directory. They are copied after the builtin static files,
-# so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static', '_static/images']
-html_css_files = [
- 'css/default.css',
-]
-
-# Custom sidebar templates, must be a dictionary that maps document names
-# to template names.
-#
-# The default sidebars (for documents that don't match any pattern) are
-# defined by theme itself. Builtin themes are using these templates by
-# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
-# 'searchbox.html']``.
-#
-# html_sidebars = {}
-
-
-# -- Options for HTMLHelp output ---------------------------------------------
-
-# Output file base name for HTML help builder.
-htmlhelp_basename = 'baaldoc'
-
-# -- Options for LaTeX output ------------------------------------------------
-
-latex_elements = {
- # The paper size ('letterpaper' or 'a4paper').
- #
- # 'papersize': 'letterpaper',
- # The font size ('10pt', '11pt' or '12pt').
- #
- # 'pointsize': '10pt',
- # Additional stuff for the LaTeX preamble.
- #
- # 'preamble': '',
- # Latex figure (float) alignment
- #
- # 'figure_align': 'htbp',
-}
-
-# Grouping the document tree into LaTeX files. List of tuples
-# (source start file, target name, title,
-# author, documentclass [howto, manual, or own class]).
-latex_documents = [
- (
- master_doc,
- 'baal.tex',
- 'baal Documentation',
- 'Parmida Atighehchian, Frédéric Branchaud-Charron, Jan Freyberg',
- 'manual',
- )
-]
-
-# -- Options for manual page output ------------------------------------------
-
-# One entry per manual page. List of tuples
-# (source start file, name, description, authors, manual section).
-man_pages = [(master_doc, 'baal', 'baal Documentation', [author], 1)]
-
-# -- Options for Texinfo output ----------------------------------------------
-
-# Grouping the document tree into Texinfo files. List of tuples
-# (source start file, target name, title, author,
-# dir menu entry, description, category)
-texinfo_documents = [
- (
- master_doc,
- 'baal',
- 'baal Documentation',
- author,
- 'baal',
- 'One line description of project.',
- 'Miscellaneous',
- )
-]
-
-# -- Options for Epub output -------------------------------------------------
-
-# Bibliographic Dublin Core info.
-epub_title = project
-
-# The unique identifier of the text. This can be a ISBN number
-# or the project homepage.
-#
-# epub_identifier = ''
-
-# A unique identification for the text.
-#
-# epub_uid = ''
-
-# A list of files that should not be packed into the epub file.
-epub_exclude_files = ['search.html']
-
-# -- Extension configuration -------------------------------------------------
-
-# -- Options for todo extension ----------------------------------------------
-
-# If true, `todo` and `todoList` produce output, else they produce nothing.
-todo_include_todos = True
-
-
-# At the bottom of conf.py
-def setup(app):
- app.add_config_value('recommonmark_config', {
- 'enable_auto_toc_tree': True,
- 'enable_eval_rst': True,
- 'enable_math': True,
- 'enable_inline_math': True,
- 'auto_toc_tree_section': 'Contents',
- }, True)
- app.add_transform(AutoStructify)
diff --git a/docs/index.md b/docs/index.md
index 30c794f5..12cbc38e 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -1,16 +1,6 @@
-```eval_rst
-.. baal documentation master file, created by
- sphinx-quickstart on Thu Apr 4 14:15:25 2019.
- You can adapt this file completely to your liking, but it should at least
- contain the root `toctree` directive.
-```
-
-# Welcome to the documentation for baal (**ba**yesian **a**ctive **l**earning)
+# Welcome to Baal (**ba**yesian **a**ctive **l**earning)
-
-Star
-
-BaaL is a Bayesian active learning library.
+Baal is a Bayesian active learning library.
We provide methods to estimate sampling from the posterior distribution
in order to maximize the efficiency of labelling during active learning. Our library is suitable for research and industrial applications.
@@ -23,20 +13,20 @@ If you have any question, we are reachable on [Slack](https://join.slack.com/t/b
For support, we have several ways to help you:
-* Our [FAQ](faq.md)
+* Our [FAQ](support/faq.md)
* Submit an issue on Github [here](https://github.com/ElementAI/baal/issues/new/choose)
* Join our [Slack](https://join.slack.com/t/baal-world/shared_invite/zt-z0izhn4y-Jt6Zu5dZaV2rsAS9sdISfg)!
-```eval_rst
-.. toctree::
- :caption: Learn more about BaaL
- :maxdepth: 1
+## :material-file-tree: Learn more about Baal
+
+* [:material-link: User Guide](user_guide)
+* [:material-book-education: Active learning dataset and training loop classes](notebooks/fundamentals/active-learning.ipynb)
+* [:material-book-education: Methods for approximating bayesian posteriors](notebooks/fundamentals/posteriors.ipynb)
+* [:material-link: API Index](api)
+* [:material-help: FAQ](support/faq.md)
- User guide
- Active learning dataset and training loop classes
- Methods for approximating bayesian posteriors
- API Index
- FAQ
+## :material-file-tree: Industry
+* [:material-book-education: Active learning dataset and training loop classes](notebooks/fundamentals/active-learning.ipynb)
.. toctree ::
:caption: Tutorials
@@ -70,10 +60,3 @@ For support, we have several ways to help you:
Background literature
Cheat Sheet
```
-
-## Indices and tables
-
-```eval_rst
-* :ref:`genindex`
-* :ref:`search`
-```
diff --git a/docs/industry/index.md b/docs/industry/index.md
new file mode 100644
index 00000000..e69de29b
diff --git a/docs/tutorials/label-studio.md b/docs/industry/label-studio.md
similarity index 100%
rename from docs/tutorials/label-studio.md
rename to docs/industry/label-studio.md
diff --git a/docs/research/index.md b/docs/research/index.md
new file mode 100644
index 00000000..64473875
--- /dev/null
+++ b/docs/research/index.md
@@ -0,0 +1,2 @@
+# Bayesian active learning research
+
diff --git a/docs/literature/Additional papers/dmi.md b/docs/research/literature/Additional papers/dmi.md
similarity index 100%
rename from docs/literature/Additional papers/dmi.md
rename to docs/research/literature/Additional papers/dmi.md
diff --git a/docs/literature/Additional papers/duq.md b/docs/research/literature/Additional papers/duq.md
similarity index 100%
rename from docs/literature/Additional papers/duq.md
rename to docs/research/literature/Additional papers/duq.md
diff --git a/docs/literature/Additional papers/gyolov3.md b/docs/research/literature/Additional papers/gyolov3.md
similarity index 100%
rename from docs/literature/Additional papers/gyolov3.md
rename to docs/research/literature/Additional papers/gyolov3.md
diff --git a/docs/literature/Additional papers/lightcoresets.md b/docs/research/literature/Additional papers/lightcoresets.md
similarity index 100%
rename from docs/literature/Additional papers/lightcoresets.md
rename to docs/research/literature/Additional papers/lightcoresets.md
diff --git a/docs/literature/Additional papers/sparse_selection.md b/docs/research/literature/Additional papers/sparse_selection.md
similarity index 100%
rename from docs/literature/Additional papers/sparse_selection.md
rename to docs/research/literature/Additional papers/sparse_selection.md
diff --git a/docs/literature/Additional papers/vaal.md b/docs/research/literature/Additional papers/vaal.md
similarity index 100%
rename from docs/literature/Additional papers/vaal.md
rename to docs/research/literature/Additional papers/vaal.md
diff --git a/docs/literature/core-papers.md b/docs/research/literature/core-papers.md
similarity index 100%
rename from docs/literature/core-papers.md
rename to docs/research/literature/core-papers.md
diff --git a/docs/literature/images/Baalscheme.svg b/docs/research/literature/images/Baalscheme.svg
similarity index 100%
rename from docs/literature/images/Baalscheme.svg
rename to docs/research/literature/images/Baalscheme.svg
diff --git a/docs/literature/images/GYOLOV3/fig1.png b/docs/research/literature/images/GYOLOV3/fig1.png
similarity index 100%
rename from docs/literature/images/GYOLOV3/fig1.png
rename to docs/research/literature/images/GYOLOV3/fig1.png
diff --git a/docs/literature/images/GYOLOV3/fig2.png b/docs/research/literature/images/GYOLOV3/fig2.png
similarity index 100%
rename from docs/literature/images/GYOLOV3/fig2.png
rename to docs/research/literature/images/GYOLOV3/fig2.png
diff --git a/docs/literature/images/GYOLOV3/fig3.png b/docs/research/literature/images/GYOLOV3/fig3.png
similarity index 100%
rename from docs/literature/images/GYOLOV3/fig3.png
rename to docs/research/literature/images/GYOLOV3/fig3.png
diff --git a/docs/literature/images/dmi/fig3.png b/docs/research/literature/images/dmi/fig3.png
similarity index 100%
rename from docs/literature/images/dmi/fig3.png
rename to docs/research/literature/images/dmi/fig3.png
diff --git a/docs/literature/images/experiment_results/iterations_mcdc.png b/docs/research/literature/images/experiment_results/iterations_mcdc.png
similarity index 100%
rename from docs/literature/images/experiment_results/iterations_mcdc.png
rename to docs/research/literature/images/experiment_results/iterations_mcdc.png
diff --git a/docs/literature/images/lightcoreset/q_func.png b/docs/research/literature/images/lightcoreset/q_func.png
similarity index 100%
rename from docs/literature/images/lightcoreset/q_func.png
rename to docs/research/literature/images/lightcoreset/q_func.png
diff --git a/docs/literature/images/logo_original.png b/docs/research/literature/images/logo_original.png
similarity index 100%
rename from docs/literature/images/logo_original.png
rename to docs/research/literature/images/logo_original.png
diff --git a/docs/literature/images/repo_logo_25.jpg b/docs/research/literature/images/repo_logo_25.jpg
similarity index 100%
rename from docs/literature/images/repo_logo_25.jpg
rename to docs/research/literature/images/repo_logo_25.jpg
diff --git a/docs/literature/images/repo_logo_25_no_corner.svg b/docs/research/literature/images/repo_logo_25_no_corner.svg
similarity index 100%
rename from docs/literature/images/repo_logo_25_no_corner.svg
rename to docs/research/literature/images/repo_logo_25_no_corner.svg
diff --git a/docs/literature/images/sparse_selection/eq4.png b/docs/research/literature/images/sparse_selection/eq4.png
similarity index 100%
rename from docs/literature/images/sparse_selection/eq4.png
rename to docs/research/literature/images/sparse_selection/eq4.png
diff --git a/docs/literature/images/sparse_selection/fig4.png b/docs/research/literature/images/sparse_selection/fig4.png
similarity index 100%
rename from docs/literature/images/sparse_selection/fig4.png
rename to docs/research/literature/images/sparse_selection/fig4.png
diff --git a/docs/literature/images/vaal/fig1.png b/docs/research/literature/images/vaal/fig1.png
similarity index 100%
rename from docs/literature/images/vaal/fig1.png
rename to docs/research/literature/images/vaal/fig1.png
diff --git a/docs/literature/images/vaal/fig2.png b/docs/research/literature/images/vaal/fig2.png
similarity index 100%
rename from docs/literature/images/vaal/fig2.png
rename to docs/research/literature/images/vaal/fig2.png
diff --git a/docs/literature/index.md b/docs/research/literature/index.md
similarity index 100%
rename from docs/literature/index.md
rename to docs/research/literature/index.md
diff --git a/docs/literature/more_papers.md b/docs/research/literature/more_papers.md
similarity index 100%
rename from docs/literature/more_papers.md
rename to docs/research/literature/more_papers.md
diff --git a/docs/faq.md b/docs/support/faq.md
similarity index 98%
rename from docs/faq.md
rename to docs/support/faq.md
index cd8269da..58d99618 100644
--- a/docs/faq.md
+++ b/docs/support/faq.md
@@ -135,7 +135,7 @@ active_dataset.label(ranks, labels)
Bayesian active learning is a relatively small field with a lot of unknowns. This section aims at presenting some of our
findings so that newcomers can quickly learn.
-Don't forget to look at our [literature review](../literature/index.md) for a good introduction to the field.
+Don't forget to look at our [literature review](../research/literature/index.md) for a good introduction to the field.
### Should you use early stopping?
diff --git a/docs/support/index.md b/docs/support/index.md
new file mode 100644
index 00000000..852c875b
--- /dev/null
+++ b/docs/support/index.md
@@ -0,0 +1,2 @@
+# Support
+
diff --git a/docs/user_guide/index.md b/docs/user_guide/index.md
index e60af564..4fc2a264 100644
--- a/docs/user_guide/index.md
+++ b/docs/user_guide/index.md
@@ -53,7 +53,7 @@ We hope that work in this area continues so that we can better understand the im
**Resources**
-* [Literature review](../literature/index.md)
+* [Literature review](../research/literature/index.md)
* [Active learning dataset and training loop classes](../notebooks/fundamentals/active-learning)
* [Methods for approximating bayesian posteriors](../notebooks/fundamentals/posteriors)
* [Full active learning example](../notebooks/active_learning_process)
diff --git a/mkdocs.yml b/mkdocs.yml
new file mode 100644
index 00000000..6b6a4c26
--- /dev/null
+++ b/mkdocs.yml
@@ -0,0 +1,50 @@
+site_name: Baal Documentation
+repo_url: https://github.com/baal-org/baal
+edit_uri: edit/master/docs/
+theme:
+ name: material
+ palette:
+ # Palette toggle for light mode
+ - media: "(prefers-color-scheme: light)"
+ scheme: default
+ primary: blue grey
+ toggle:
+ icon: material/brightness-7
+ name: Switch to dark mode
+
+ # Palette toggle for dark mode
+ - media: "(prefers-color-scheme: dark)"
+ scheme: slate
+ primary: blue grey
+ toggle:
+ icon: material/brightness-4
+ name: Switch to light mode
+ features:
+ - navigation.tabs
+ - navigation.indexes
+ - navigation.instant
+ icon:
+ repo: fontawesome/brands/github
+plugins:
+ - search
+
+markdown_extensions:
+ - attr_list
+ - pymdownx.emoji:
+ emoji_index: !!python/name:materialx.emoji.twemoji
+ emoji_generator: !!python/name:materialx.emoji.to_svg
+
+nav:
+ - Home: index.md
+ - User Guide:
+ - user_guide/index.md
+ - API:
+ - api/index.md
+ - Production:
+ - industry/index.md
+ - Research:
+ - research/index.md
+ - Blog:
+ - blog/index.md
+ - Support:
+ - support/index.md
diff --git a/poetry.lock b/poetry.lock
index 7ea02d0f..970d6b6d 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -42,79 +42,6 @@ python-versions = ">=3.6"
[package.dependencies]
frozenlist = ">=1.1.0"
-[[package]]
-name = "alabaster"
-version = "0.7.12"
-description = "A configurable sidebar-enabled Sphinx theme"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[[package]]
-name = "appnope"
-version = "0.1.2"
-description = "Disable App Nap on macOS >= 10.9"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[[package]]
-name = "argcomplete"
-version = "2.0.0"
-description = "Bash tab completion for argparse"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-importlib-metadata = {version = ">=0.23,<5", markers = "python_version == \"3.7\""}
-
-[package.extras]
-test = ["coverage", "flake8", "pexpect", "wheel"]
-
-[[package]]
-name = "argon2-cffi"
-version = "21.3.0"
-description = "The secure Argon2 password hashing algorithm."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-argon2-cffi-bindings = "*"
-typing-extensions = {version = "*", markers = "python_version < \"3.8\""}
-
-[package.extras]
-dev = ["pre-commit", "cogapp", "tomli", "coverage[toml] (>=5.0.2)", "hypothesis", "pytest", "sphinx", "sphinx-notfound-page", "furo"]
-docs = ["sphinx", "sphinx-notfound-page", "furo"]
-tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pytest"]
-
-[[package]]
-name = "argon2-cffi-bindings"
-version = "21.2.0"
-description = "Low-level CFFI bindings for Argon2"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-cffi = ">=1.0.1"
-
-[package.extras]
-dev = ["pytest", "cogapp", "pre-commit", "wheel"]
-tests = ["pytest"]
-
-[[package]]
-name = "asteroid-sphinx-theme"
-version = "0.0.3"
-description = "Asteroid: Sphinx Theme"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-sphinx = "*"
-
[[package]]
name = "async-timeout"
version = "4.0.2"
@@ -156,25 +83,6 @@ docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "cloudpickle"]
tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "cloudpickle"]
-[[package]]
-name = "babel"
-version = "2.9.1"
-description = "Internationalization utilities"
-category = "dev"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-
-[package.dependencies]
-pytz = ">=2015.7"
-
-[[package]]
-name = "backcall"
-version = "0.2.0"
-description = "Specifications for callback functions passed in to an API"
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "bandit"
version = "1.7.1"
@@ -204,7 +112,10 @@ pathspec = ">=0.9.0,<1"
platformdirs = ">=2"
tomli = ">=0.2.6,<2.0.0"
typed-ast = {version = ">=1.4.2", markers = "python_version < \"3.8\" and implementation_name == \"cpython\""}
-typing-extensions = ">=3.10.0.0"
+typing-extensions = [
+ {version = ">=3.10.0.0", markers = "python_version < \"3.10\""},
+ {version = "!=3.10.0.1", markers = "python_version >= \"3.10\""},
+]
[package.extras]
colorama = ["colorama (>=0.4.3)"]
@@ -292,17 +203,6 @@ category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
-[[package]]
-name = "commonmark"
-version = "0.9.1"
-description = "Python parser for the CommonMark Markdown spec"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.extras]
-test = ["flake8 (==3.7.8)", "hypothesis (==3.55.3)"]
-
[[package]]
name = "coverage"
version = "6.2"
@@ -349,32 +249,16 @@ xxhash = "*"
apache-beam = ["apache-beam (>=2.26.0)"]
audio = ["librosa"]
benchmarks = ["numpy (==1.18.5)", "tensorflow (==2.3.0)", "torch (==1.6.0)", "transformers (==3.0.2)"]
-dev = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[s3,server] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "transformers", "bs4", "conllu", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "black (==21.4b0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)", "importlib-resources"]
+dev = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[server,s3] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "transformers", "bs4", "conllu", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "black (==21.4b0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)", "importlib-resources"]
docs = ["docutils (==0.16.0)", "recommonmark", "sphinx (==3.1.2)", "sphinx-markdown-tables", "sphinx-rtd-theme (==0.4.3)", "sphinxext-opengraph (==0.4.1)", "sphinx-copybutton", "fsspec (<2021.9.0)", "s3fs", "sphinx-panels", "sphinx-inline-tabs", "myst-parser", "Markdown (!=3.3.5)"]
quality = ["black (==21.4b0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)"]
s3 = ["fsspec", "boto3", "botocore", "s3fs"]
tensorflow = ["tensorflow (>=2.2.0,!=2.6.0,!=2.6.1)"]
tensorflow_gpu = ["tensorflow-gpu (>=2.2.0,!=2.6.0,!=2.6.1)"]
-tests = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[s3,server] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "transformers", "bs4", "conllu", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "importlib-resources"]
+tests = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[server,s3] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "transformers", "bs4", "conllu", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "importlib-resources"]
torch = ["torch"]
vision = ["Pillow (>=6.2.1)"]
-[[package]]
-name = "debugpy"
-version = "1.5.1"
-description = "An implementation of the Debug Adapter Protocol for Python"
-category = "dev"
-optional = false
-python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*"
-
-[[package]]
-name = "decorator"
-version = "5.1.1"
-description = "Decorators for Humans"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
[[package]]
name = "defusedxml"
version = "0.7.1"
@@ -489,11 +373,11 @@ python-versions = ">=3.6"
[[package]]
name = "fsspec"
-version = "2022.1.0"
+version = "2022.5.0"
description = "File-system specification"
category = "main"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
[package.dependencies]
aiohttp = {version = "*", optional = true, markers = "extra == \"http\""}
@@ -520,6 +404,7 @@ s3 = ["s3fs"]
sftp = ["paramiko"]
smb = ["smbprotocol"]
ssh = ["paramiko"]
+tqdm = ["tqdm"]
[[package]]
name = "future"
@@ -529,6 +414,20 @@ category = "dev"
optional = false
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
+[[package]]
+name = "ghp-import"
+version = "2.1.0"
+description = "Copy your docs directly to the gh-pages branch."
+category = "dev"
+optional = false
+python-versions = "*"
+
+[package.dependencies]
+python-dateutil = ">=2.8.1"
+
+[package.extras]
+dev = ["twine", "markdown", "flake8", "wheel"]
+
[[package]]
name = "gitdb"
version = "4.0.9"
@@ -667,14 +566,6 @@ category = "main"
optional = false
python-versions = ">=3.5"
-[[package]]
-name = "imagesize"
-version = "1.3.0"
-description = "Getting image size from png/jpeg/jpeg2000/gif file"
-category = "dev"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-
[[package]]
name = "importlib-metadata"
version = "4.10.0"
@@ -715,61 +606,6 @@ category = "dev"
optional = false
python-versions = "*"
-[[package]]
-name = "ipykernel"
-version = "6.6.1"
-description = "IPython Kernel for Jupyter"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-appnope = {version = "*", markers = "platform_system == \"Darwin\""}
-argcomplete = {version = ">=1.12.3", markers = "python_version < \"3.8.0\""}
-debugpy = ">=1.0.0,<2.0"
-importlib-metadata = {version = "<5", markers = "python_version < \"3.8.0\""}
-ipython = ">=7.23.1"
-jupyter-client = "<8.0"
-matplotlib-inline = ">=0.1.0,<0.2.0"
-nest-asyncio = "*"
-tornado = ">=4.2,<7.0"
-traitlets = ">=5.1.0,<6.0"
-
-[package.extras]
-test = ["pytest (!=5.3.4)", "pytest-cov", "flaky", "ipyparallel"]
-
-[[package]]
-name = "ipython"
-version = "7.31.1"
-description = "IPython: Productive Interactive Computing"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-appnope = {version = "*", markers = "sys_platform == \"darwin\""}
-backcall = "*"
-colorama = {version = "*", markers = "sys_platform == \"win32\""}
-decorator = "*"
-jedi = ">=0.16"
-matplotlib-inline = "*"
-pexpect = {version = ">4.3", markers = "sys_platform != \"win32\""}
-pickleshare = "*"
-prompt-toolkit = ">=2.0.0,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.1.0"
-pygments = "*"
-traitlets = ">=4.2"
-
-[package.extras]
-all = ["Sphinx (>=1.3)", "ipykernel", "ipyparallel", "ipywidgets", "nbconvert", "nbformat", "nose (>=0.10.1)", "notebook", "numpy (>=1.17)", "pygments", "qtconsole", "requests", "testpath"]
-doc = ["Sphinx (>=1.3)"]
-kernel = ["ipykernel"]
-nbconvert = ["nbconvert"]
-nbformat = ["nbformat"]
-notebook = ["notebook", "ipywidgets"]
-parallel = ["ipyparallel"]
-qtconsole = ["qtconsole"]
-test = ["nose (>=0.10.1)", "requests", "testpath", "pygments", "nbformat", "ipykernel", "numpy (>=1.17)"]
-
[[package]]
name = "ipython-genutils"
version = "0.2.0"
@@ -778,41 +614,6 @@ category = "dev"
optional = false
python-versions = "*"
-[[package]]
-name = "ipywidgets"
-version = "7.6.5"
-description = "IPython HTML widgets for Jupyter"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-ipykernel = ">=4.5.1"
-ipython = {version = ">=4.0.0", markers = "python_version >= \"3.3\""}
-ipython-genutils = ">=0.2.0,<0.3.0"
-jupyterlab-widgets = {version = ">=1.0.0", markers = "python_version >= \"3.6\""}
-nbformat = ">=4.2.0"
-traitlets = ">=4.3.1"
-widgetsnbextension = ">=3.5.0,<3.6.0"
-
-[package.extras]
-test = ["pytest (>=3.6.0)", "pytest-cov", "mock"]
-
-[[package]]
-name = "jedi"
-version = "0.18.1"
-description = "An autocompletion tool for Python that can be used for text editors."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-parso = ">=0.8.0,<0.9.0"
-
-[package.extras]
-qa = ["flake8 (==3.8.3)", "mypy (==0.782)"]
-testing = ["Django (<3.1)", "colorama", "docopt", "pytest (<7.0.0)"]
-
[[package]]
name = "jinja2"
version = "3.0.3"
@@ -837,7 +638,7 @@ python-versions = ">=3.6"
[[package]]
name = "jsonargparse"
-version = "4.1.4"
+version = "4.9.0"
description = "Parsing of command line options, yaml/jsonnet config files and/or environment variables based on argparse."
category = "dev"
optional = false
@@ -851,11 +652,12 @@ PyYAML = ">=3.13"
all = ["docstring-parser (>=0.7.3)", "jsonschema (>=3.2.0)", "jsonnet (>=0.13.0)", "validators (>=0.14.2)", "requests (>=2.18.4)", "fsspec (>=0.8.4)", "argcomplete (>=2.0.0)", "ruyaml (>=0.20.0)", "omegaconf (>=2.1.1)", "reconplogger (>=4.4.0)", "typing-extensions (>=3.10.0.0)", "dataclasses (>=0.8)"]
argcomplete = ["argcomplete (>=2.0.0)"]
dataclasses = ["dataclasses (>=0.8)"]
-dev = ["coverage (>=4.5.1)", "responses (>=0.12.0)", "pylint (>=1.8.3)", "pycodestyle (>=2.5.0)", "mypy (>=0.701)", "bump2version (>=0.5.11)", "twine (>=3.1.1)"]
+dev = ["coverage (>=4.5.1)", "responses (>=0.12.0)", "Sphinx (>=1.7.9)", "sphinx-rtd-theme (>=0.4.3)", "autodocsumm (>=0.1.10)", "sphinx-autodoc-typehints (>=1.11.1)", "pre-commit (>=2.19.0)", "pylint (>=1.8.3)", "pycodestyle (>=2.5.0)", "mypy (>=0.701)", "tox (>=3.25.0)"]
doc = ["Sphinx (>=1.7.9)", "sphinx-rtd-theme (>=0.4.3)", "autodocsumm (>=0.1.10)", "sphinx-autodoc-typehints (>=1.11.1)"]
fsspec = ["fsspec (>=0.8.4)"]
jsonnet = ["jsonnet (>=0.13.0)"]
jsonschema = ["jsonschema (>=3.2.0)"]
+maintainer = ["bump2version (>=0.5.11)", "twine (>=4.0.0)"]
omegaconf = ["omegaconf (>=2.1.1)"]
reconplogger = ["reconplogger (>=4.4.0)"]
ruyaml = ["ruyaml (>=0.20.0)"]
@@ -917,21 +719,6 @@ python-versions = ">=3.6"
pywin32 = {version = ">=1.0", markers = "sys_platform == \"win32\" and platform_python_implementation != \"PyPy\""}
traitlets = "*"
-[[package]]
-name = "jupyter-sphinx"
-version = "0.3.2"
-description = "Jupyter Sphinx Extensions"
-category = "dev"
-optional = false
-python-versions = ">= 3.6"
-
-[package.dependencies]
-IPython = "*"
-ipywidgets = ">=7.0.0"
-nbconvert = ">=5.5"
-nbformat = "*"
-Sphinx = ">=2"
-
[[package]]
name = "jupyterlab-pygments"
version = "0.1.2"
@@ -944,12 +731,23 @@ python-versions = "*"
pygments = ">=2.4.1,<3"
[[package]]
-name = "jupyterlab-widgets"
-version = "1.0.2"
-description = "A JupyterLab extension."
+name = "jupytext"
+version = "1.13.8"
+description = "Jupyter notebooks as Markdown documents, Julia, Python or R scripts"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = "~=3.6"
+
+[package.dependencies]
+markdown-it-py = ">=1.0.0,<3.0.0"
+mdit-py-plugins = "*"
+nbformat = "*"
+pyyaml = "*"
+toml = "*"
+
+[package.extras]
+rst2md = ["sphinx-gallery (>=0.7.0,<0.8.0)"]
+toml = ["toml"]
[[package]]
name = "kiwisolver"
@@ -994,8 +792,8 @@ test = ["codecov (>=2.1)", "pytest (>=6.0)", "pytest-cov (>2.10)", "flake8", "ch
[[package]]
name = "lightning-flash"
-version = "0.7.0rc0"
-description = "Flash is a framework for fast prototyping, finetuning, and solving most standard deep learning challenges"
+version = "0.8.0.dev0"
+description = "Your PyTorch AI Factory - Flash enables you to easily configure and run complex AI recipes."
category = "dev"
optional = false
python-versions = ">=3.6"
@@ -1010,6 +808,7 @@ numpy = "*"
packaging = "*"
pandas = ">=1.1.0"
Pillow = {version = ">=7.2", optional = true, markers = "extra == \"image\""}
+protobuf = "<=3.20.1"
pyDeprecate = "*"
pystiche = {version = ">=1.0.0,<2.0.0", optional = true, markers = "extra == \"image\""}
pytorch-lightning = ">=1.3.6"
@@ -1020,28 +819,30 @@ torchmetrics = ">=0.5.0,<0.5.1 || >0.5.1"
torchvision = {version = "*", optional = true, markers = "extra == \"image\""}
[package.extras]
-all = ["pytorch-forecasting (>=0.9.0)", "torchmetrics[text] (>=0.5.1)", "pytorchvideo (==0.1.2)", "timm (>=0.4.5)", "Pillow (>=7.2)", "pystiche (>=1.0.0,<2.0.0)", "pytorch-tabular (==0.7.0)", "transformers (>=4.5)", "librosa (>=0.8.1)", "filelock", "datasets (>=1.8)", "torchaudio", "scikit-learn", "lightning-bolts (>=0.3.3)", "kornia (>=0.5.1)", "sentencepiece (>=0.1.95)", "sentence-transformers", "transformers (>=4.13.0)", "datasets (>=1.16.1)", "torchvision", "segmentation-models-pytorch"]
-audio = ["torchaudio", "transformers (>=4.13.0)", "librosa (>=0.8.1)", "datasets (>=1.16.1)"]
-dev = ["myst-parser (>=0.15)", "pytorch-forecasting (>=0.9.0)", "torchmetrics[text] (>=0.5.1)", "ipython", "coverage", "sphinxcontrib-fulltoc (>=1.0)", "docutils (>=0.16)", "pytorchvideo (==0.1.2)", "pytest-flake8", "timm (>=0.4.5)", "Pillow (>=7.2)", "torch-optimizer", "sphinx-autodoc-typehints (>=1.0)", "pytest-rerunfailures (>=10.0)", "pytest-mock", "nbsphinx (>=0.8.5)", "pystiche (>=1.0.0,<2.0.0)", "pytorch-tabular (==0.7.0)", "sphinx-togglebutton (>=0.2)", "transformers (>=4.5)", "pre-commit", "sphinx (>=4.0,<5.0)", "librosa (>=0.8.1)", "pytest-doctestplus (>=0.9.0)", "filelock", "datasets (>=1.8)", "torchaudio", "scikit-learn", "lightning-bolts (>=0.3.3)", "kornia (>=0.5.1)", "check-manifest", "isort", "codecov (>=2.1)", "sentencepiece (>=0.1.95)", "flake8", "pandoc (>=1.0)", "sentence-transformers", "transformers (>=4.13.0)", "twine (==3.2)", "datasets (>=1.16.1)", "pytest (>=5.0,<7.0)", "sphinxcontrib-mockautodoc", "torchvision", "segmentation-models-pytorch", "sphinx-copybutton (>=0.3)", "sphinx-paramlinks (>=0.5.1)"]
-docs = ["myst-parser (>=0.15)", "ipython", "sphinxcontrib-fulltoc (>=1.0)", "sphinx (>=4.0,<5.0)", "docutils (>=0.16)", "sphinxcontrib-mockautodoc", "sphinx-paramlinks (>=0.5.1)", "sphinx-copybutton (>=0.3)", "pandoc (>=1.0)", "sphinx-autodoc-typehints (>=1.0)", "sphinx-togglebutton (>=0.2)", "nbsphinx (>=0.8.5)"]
-graph = ["torch-cluster", "networkx", "torch-geometric (>=2.0.0)", "torch-sparse", "torch-scatter"]
-image = ["pystiche (>=1.0.0,<2.0.0)", "kornia (>=0.5.1)", "timm (>=0.4.5)", "Pillow (>=7.2)", "torchvision", "segmentation-models-pytorch", "lightning-bolts (>=0.3.3)"]
-image_extras = ["timm (>=0.4.5)", "Pillow (>=7.2)", "matplotlib", "pystiche (>=1.0.0,<2.0.0)", "effdet", "fastface", "albumentations", "baal", "lightning-bolts (>=0.3.3)", "kornia (>=0.5.1)", "icedata", "classy-vision", "fiftyone", "icevision (>=0.8)", "structlog (==21.1.0)", "torchvision", "learn2learn", "segmentation-models-pytorch", "vissl (>=0.1.5)", "fairscale"]
-notebooks = ["jupyter", "jupyter-client", "nbconvert"]
-pointcloud = ["open3d (==0.13)", "tensorboard", "torchvision", "torch (==1.7.1)"]
-serve = ["pillow", "uvicorn[standard] (>=0.12.0,<0.14.0)", "jinja2", "cytoolz", "graphviz", "aiofiles", "torchvision", "starlette (==0.14.2)", "numpy", "fastapi (>=0.65.2,<0.66.0)", "pyyaml", "tqdm", "pydantic (>1.8.1,<2.0.0)", "importlib-metadata (>=0.12,<3)"]
-tabular = ["scikit-learn", "pytorch-forecasting (>=0.9.0)", "pytorch-tabular (==0.7.0)"]
-test = ["check-manifest", "twine (==3.2)", "coverage", "pre-commit", "isort", "pytest (>=5.0,<7.0)", "codecov (>=2.1)", "pytest-flake8", "pytest-doctestplus (>=0.9.0)", "torch-optimizer", "flake8", "scikit-learn", "pytest-rerunfailures (>=10.0)", "pytest-mock"]
-text = ["torchmetrics[text] (>=0.5.1)", "transformers (>=4.5)", "sentencepiece (>=0.1.95)", "filelock", "datasets (>=1.8)", "sentence-transformers"]
-video = ["Pillow (>=7.2)", "kornia (>=0.5.1)", "torchvision", "pytorchvideo (==0.1.2)"]
-video_extras = ["kornia (>=0.5.1)", "pytorchvideo (==0.1.2)", "Pillow (>=7.2)", "fiftyone", "torchvision"]
-vision = ["pystiche (>=1.0.0,<2.0.0)", "kornia (>=0.5.1)", "pytorchvideo (==0.1.2)", "timm (>=0.4.5)", "Pillow (>=7.2)", "torchvision", "segmentation-models-pytorch", "lightning-bolts (>=0.3.3)"]
+all = ["datasets (>=1.8)", "torchmetrics[text] (>=0.5.1)", "segmentation-models-pytorch", "pytorchvideo (==0.1.2)", "pytorch-tabular (==0.7.0)", "sentencepiece (>=0.1.95)", "pytorch-forecasting (>=0.9.0)", "transformers (>=4.13.0)", "kornia (>=0.5.1)", "pystiche (>=1.0.0,<2.0.0)", "timm (>=0.4.5)", "lightning-bolts (>=0.3.3)", "torchaudio", "datasets (>=1.16.1)", "scikit-learn", "torchmetrics (<0.8.0)", "torchvision", "Pillow (>=7.2)", "filelock", "sentence-transformers", "librosa (>=0.8.1)", "transformers (>=4.5)", "omegaconf (<=2.1.1)"]
+audio = ["datasets (>=1.16.1)", "transformers (>=4.13.0)", "librosa (>=0.8.1)", "torchaudio"]
+core = ["torchvision", "torchmetrics (<0.8.0)", "pystiche (>=1.0.0,<2.0.0)", "Pillow (>=7.2)", "pytorch-tabular (==0.7.0)", "omegaconf (<=2.1.1)", "datasets (>=1.8)", "sentence-transformers", "transformers (>=4.5)", "torchmetrics[text] (>=0.5.1)", "timm (>=0.4.5)", "sentencepiece (>=0.1.95)", "lightning-bolts (>=0.3.3)", "pytorch-forecasting (>=0.9.0)", "filelock", "scikit-learn", "segmentation-models-pytorch", "kornia (>=0.5.1)"]
+dev = ["datasets (>=1.8)", "pandoc (>=1.0)", "torchmetrics[text] (>=0.5.1)", "pytest-flake8", "segmentation-models-pytorch", "pre-commit", "codecov (>=2.1)", "pt-lightning-sphinx-theme", "pytorchvideo (==0.1.2)", "sphinx (>=4.0,<5.0)", "sphinx-autodoc-typehints (>=1.0)", "sphinx-copybutton (>=0.3)", "sphinxcontrib-mockautodoc", "pytorch-tabular (==0.7.0)", "pytest-doctestplus (>=0.9.0)", "sphinxcontrib-fulltoc (>=1.0)", "sentencepiece (>=0.1.95)", "pytorch-forecasting (>=0.9.0)", "transformers (>=4.13.0)", "pytest-mock", "torch-optimizer", "kornia (>=0.5.1)", "twine (==3.2)", "pystiche (>=1.0.0,<2.0.0)", "flake8", "pytest (>=5.0,<7.0)", "coverage", "sphinx-paramlinks (>=0.5.1)", "sphinx-togglebutton (>=0.2)", "timm (>=0.4.5)", "pytest-rerunfailures (>=10.0)", "lightning-bolts (>=0.3.3)", "torchaudio", "datasets (>=1.16.1)", "docutils (>=0.16)", "check-manifest", "scikit-learn", "torchmetrics (<0.8.0)", "torchvision", "Pillow (>=7.2)", "nbsphinx (>=0.8.5)", "filelock", "isort", "myst-parser (>=0.15)", "jinja2 (>=3.0.0,<3.1.0)", "ipython", "sentence-transformers", "pytest-forked", "librosa (>=0.8.1)", "transformers (>=4.5)", "omegaconf (<=2.1.1)"]
+docs = ["sphinx (>=4.0,<5.0)", "sphinx-autodoc-typehints (>=1.0)", "sphinx-copybutton (>=0.3)", "sphinxcontrib-mockautodoc", "nbsphinx (>=0.8.5)", "sphinx-paramlinks (>=0.5.1)", "sphinxcontrib-fulltoc (>=1.0)", "sphinx-togglebutton (>=0.2)", "myst-parser (>=0.15)", "jinja2 (>=3.0.0,<3.1.0)", "docutils (>=0.16)", "ipython", "pandoc (>=1.0)", "pt-lightning-sphinx-theme"]
+graph = ["torch-geometric (>=2.0.0)", "networkx", "torch-scatter", "torch-sparse", "class-resolver (>=0.3.2)", "torch-cluster"]
+image = ["torchvision", "pystiche (>=1.0.0,<2.0.0)", "Pillow (>=7.2)", "timm (>=0.4.5)", "lightning-bolts (>=0.3.3)", "segmentation-models-pytorch", "kornia (>=0.5.1)"]
+image_extras = ["vissl (>=0.1.5)", "effdet", "segmentation-models-pytorch", "learn2learn", "matplotlib", "pytorch-lightning (<1.5.0)", "icevision (>=0.8)", "kornia (>=0.5.1)", "pystiche (>=1.0.0,<2.0.0)", "fastface", "icedata", "classy-vision", "timm (>=0.4.5)", "fiftyone", "lightning-bolts (>=0.3.3)", "fairscale", "torchmetrics (<0.8.0)", "torchvision", "albumentations", "Pillow (>=7.2)"]
+image_extras_baal = ["torchvision", "pystiche (>=1.0.0,<2.0.0)", "Pillow (>=7.2)", "baal (>=1.3.2)", "timm (>=0.4.5)", "lightning-bolts (>=0.3.3)", "segmentation-models-pytorch", "kornia (>=0.5.1)"]
+notebooks = ["jupyter-client", "nbconvert", "jupyter"]
+pointcloud = ["torchvision", "tensorboard", "torch (==1.7.1)", "open3d (==0.13)"]
+serve = ["pillow", "torchvision", "numpy", "cytoolz", "tqdm", "fastapi (>=0.65.2)", "pydantic (>1.8.1)", "graphviz", "uvicorn[standard] (>=0.12.0)", "aiofiles", "starlette (==0.14.2)", "jinja2 (>=3.0.0,<3.1.0)", "pyyaml"]
+tabular = ["torchmetrics (<0.8.0)", "pytorch-tabular (==0.7.0)", "pytorch-forecasting (>=0.9.0)", "scikit-learn", "omegaconf (<=2.1.1)"]
+test = ["codecov (>=2.1)", "twine (==3.2)", "flake8", "check-manifest", "pytest (>=5.0,<7.0)", "pytest-doctestplus (>=0.9.0)", "coverage", "torch-optimizer", "pytest-rerunfailures (>=10.0)", "isort", "pytest-mock", "pytest-forked", "pytest-flake8", "scikit-learn", "pre-commit"]
+text = ["datasets (>=1.8)", "sentence-transformers", "torchmetrics[text] (>=0.5.1)", "sentencepiece (>=0.1.95)", "filelock", "transformers (>=4.5)"]
+video = ["Pillow (>=7.2)", "torchvision", "pytorchvideo (==0.1.2)", "kornia (>=0.5.1)"]
+video_extras = ["torchvision", "pytorchvideo (==0.1.2)", "Pillow (>=7.2)", "fiftyone", "kornia (>=0.5.1)"]
+vision = ["torchvision", "pystiche (>=1.0.0,<2.0.0)", "pytorchvideo (==0.1.2)", "Pillow (>=7.2)", "timm (>=0.4.5)", "lightning-bolts (>=0.3.3)", "segmentation-models-pytorch", "kornia (>=0.5.1)"]
[package.source]
type = "git"
url = "https://github.com/PyTorchLightning/lightning-flash.git"
reference = "master"
-resolved_reference = "450902d713980e0edefcfd2d2a2a35eb875072d7"
+resolved_reference = "4bddc0841be95783270757383a0a6f5a26bdedc0"
[[package]]
name = "markdown"
@@ -1057,6 +858,28 @@ importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""}
[package.extras]
testing = ["coverage", "pyyaml"]
+[[package]]
+name = "markdown-it-py"
+version = "2.1.0"
+description = "Python port of markdown-it. Markdown parsing, done right!"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+mdurl = ">=0.1,<1.0"
+typing_extensions = {version = ">=3.7.4", markers = "python_version < \"3.8\""}
+
+[package.extras]
+benchmarking = ["psutil", "pytest", "pytest-benchmark (>=3.2,<4.0)"]
+code_style = ["pre-commit (==2.6)"]
+compare = ["commonmark (>=0.9.1,<0.10.0)", "markdown (>=3.3.6,<3.4.0)", "mistletoe (>=0.8.1,<0.9.0)", "mistune (>=2.0.2,<2.1.0)", "panflute (>=2.1.3,<2.2.0)"]
+linkify = ["linkify-it-py (>=1.0,<2.0)"]
+plugins = ["mdit-py-plugins"]
+profiling = ["gprof2dot"]
+rtd = ["attrs", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx-book-theme"]
+testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
+
[[package]]
name = "markupsafe"
version = "2.0.1"
@@ -1085,23 +908,44 @@ python-dateutil = ">=2.7"
setuptools_scm = ">=4"
[[package]]
-name = "matplotlib-inline"
-version = "0.1.3"
-description = "Inline Matplotlib backend for Jupyter"
+name = "mccabe"
+version = "0.6.1"
+description = "McCabe checker, plugin for flake8"
category = "dev"
optional = false
-python-versions = ">=3.5"
+python-versions = "*"
+
+[[package]]
+name = "mdit-py-plugins"
+version = "0.3.0"
+description = "Collection of plugins for markdown-it-py"
+category = "dev"
+optional = false
+python-versions = "~=3.6"
[package.dependencies]
-traitlets = "*"
+markdown-it-py = ">=1.0.0,<3.0.0"
+
+[package.extras]
+code_style = ["pre-commit (==2.6)"]
+rtd = ["myst-parser (>=0.14.0,<0.15.0)", "sphinx-book-theme (>=0.1.0,<0.2.0)"]
+testing = ["coverage", "pytest (>=3.6,<4)", "pytest-cov", "pytest-regressions"]
[[package]]
-name = "mccabe"
-version = "0.6.1"
-description = "McCabe checker, plugin for flake8"
+name = "mdurl"
+version = "0.1.1"
+description = "Markdown URL utilities"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.7"
+
+[[package]]
+name = "mergedeep"
+version = "1.3.4"
+description = "A deep merge function for 🐍."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
[[package]]
name = "mistune"
@@ -1111,6 +955,68 @@ category = "dev"
optional = false
python-versions = "*"
+[[package]]
+name = "mkdocs"
+version = "1.3.0"
+description = "Project documentation with Markdown."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+
+[package.dependencies]
+click = ">=3.3"
+ghp-import = ">=1.0"
+importlib-metadata = ">=4.3"
+Jinja2 = ">=2.10.2"
+Markdown = ">=3.2.1"
+mergedeep = ">=1.3.4"
+packaging = ">=20.5"
+PyYAML = ">=3.10"
+pyyaml-env-tag = ">=0.1"
+watchdog = ">=2.0"
+
+[package.extras]
+i18n = ["babel (>=2.9.0)"]
+
+[[package]]
+name = "mkdocs-jupyter"
+version = "0.21.0"
+description = "Use Jupyter in mkdocs websites"
+category = "dev"
+optional = false
+python-versions = ">=3.7.1,<4"
+
+[package.dependencies]
+jupytext = ">=1.13.8,<2.0.0"
+mkdocs = ">=1.2.3,<2.0.0"
+mkdocs-material = ">=8.0.0,<9.0.0"
+nbconvert = ">=6.2.0,<7.0.0"
+Pygments = ">=2.12.0,<3.0.0"
+
+[[package]]
+name = "mkdocs-material"
+version = "8.3.6"
+description = "Documentation that simply works"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+jinja2 = ">=3.0.2"
+markdown = ">=3.2"
+mkdocs = ">=1.3.0"
+mkdocs-material-extensions = ">=1.0.3"
+pygments = ">=2.12"
+pymdown-extensions = ">=9.4"
+
+[[package]]
+name = "mkdocs-material-extensions"
+version = "1.0.3"
+description = "Extension pack for Python Markdown."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+
[[package]]
name = "multidict"
version = "5.2.0"
@@ -1238,22 +1144,6 @@ traitlets = ">=4.1"
fast = ["fastjsonschema"]
test = ["check-manifest", "fastjsonschema", "testpath", "pytest", "pytest-cov"]
-[[package]]
-name = "nbsphinx"
-version = "0.8.8"
-description = "Jupyter Notebook Tools for Sphinx"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-docutils = "*"
-jinja2 = "*"
-nbconvert = "!=5.4"
-nbformat = "*"
-sphinx = ">=1.8"
-traitlets = "*"
-
[[package]]
name = "nest-asyncio"
version = "1.5.4"
@@ -1262,36 +1152,6 @@ category = "dev"
optional = false
python-versions = ">=3.5"
-[[package]]
-name = "notebook"
-version = "6.4.6"
-description = "A web-based notebook environment for interactive computing"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-argon2-cffi = "*"
-ipykernel = "*"
-ipython-genutils = "*"
-jinja2 = "*"
-jupyter-client = ">=5.3.4"
-jupyter-core = ">=4.6.1"
-nbconvert = "*"
-nbformat = "*"
-nest-asyncio = ">=1.5"
-prometheus-client = "*"
-pyzmq = ">=17"
-Send2Trash = ">=1.8.0"
-terminado = ">=0.8.3"
-tornado = ">=6.1"
-traitlets = ">=4.2.1"
-
-[package.extras]
-docs = ["sphinx", "nbsphinx", "sphinxcontrib-github-alt", "sphinx-rtd-theme", "myst-parser"]
-json-logging = ["json-logging"]
-test = ["pytest", "coverage", "requests", "nbval", "selenium", "pytest-cov", "requests-unixsocket"]
-
[[package]]
name = "numpy"
version = "1.21.5"
@@ -1300,21 +1160,6 @@ category = "main"
optional = false
python-versions = ">=3.7,<3.11"
-[[package]]
-name = "numpydoc"
-version = "1.1.0"
-description = "Sphinx extension to support docstrings in Numpy format"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.dependencies]
-Jinja2 = ">=2.3"
-sphinx = ">=1.6.5"
-
-[package.extras]
-testing = ["matplotlib", "pytest", "pytest-cov"]
-
[[package]]
name = "oauthlib"
version = "3.1.1"
@@ -1363,18 +1208,6 @@ category = "dev"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-[[package]]
-name = "parso"
-version = "0.8.3"
-description = "A Python Parser"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.extras]
-qa = ["flake8 (==3.8.3)", "mypy (==0.782)"]
-testing = ["docopt", "pytest (<6.0.0)"]
-
[[package]]
name = "pathspec"
version = "0.9.0"
@@ -1391,25 +1224,6 @@ category = "dev"
optional = false
python-versions = ">=2.6"
-[[package]]
-name = "pexpect"
-version = "4.8.0"
-description = "Pexpect allows easy control of interactive console applications."
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-ptyprocess = ">=0.5"
-
-[[package]]
-name = "pickleshare"
-version = "0.7.5"
-description = "Tiny 'shelve'-like database with concurrency support"
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "pillow"
version = "9.0.0"
@@ -1459,28 +1273,6 @@ torch = "*"
torchvision = "*"
tqdm = "*"
-[[package]]
-name = "prometheus-client"
-version = "0.12.0"
-description = "Python client for the Prometheus monitoring system."
-category = "dev"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-
-[package.extras]
-twisted = ["twisted"]
-
-[[package]]
-name = "prompt-toolkit"
-version = "3.0.24"
-description = "Library for building powerful interactive command lines in Python"
-category = "dev"
-optional = false
-python-versions = ">=3.6.2"
-
-[package.dependencies]
-wcwidth = "*"
-
[[package]]
name = "protobuf"
version = "3.19.1"
@@ -1489,14 +1281,6 @@ category = "dev"
optional = false
python-versions = ">=3.5"
-[[package]]
-name = "ptyprocess"
-version = "0.7.0"
-description = "Run a subprocess in a pseudo terminal"
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "py"
version = "1.11.0"
@@ -1569,11 +1353,22 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
[[package]]
name = "pygments"
-version = "2.11.2"
+version = "2.12.0"
description = "Pygments is a syntax highlighting package written in Python."
category = "dev"
optional = false
-python-versions = ">=3.5"
+python-versions = ">=3.6"
+
+[[package]]
+name = "pymdown-extensions"
+version = "9.5"
+description = "Extension pack for Python Markdown."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+markdown = ">=3.2"
[[package]]
name = "pyparsing"
@@ -1718,14 +1513,6 @@ category = "dev"
optional = false
python-versions = "*"
-[[package]]
-name = "pywinpty"
-version = "1.1.6"
-description = "Pseudo terminal support for Windows from Python."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
[[package]]
name = "pyyaml"
version = "6.0"
@@ -1735,29 +1522,27 @@ optional = false
python-versions = ">=3.6"
[[package]]
-name = "pyzmq"
-version = "22.3.0"
-description = "Python bindings for 0MQ"
+name = "pyyaml-env-tag"
+version = "0.1"
+description = "A custom YAML tag for referencing environment variables in YAML files. "
category = "dev"
optional = false
python-versions = ">=3.6"
[package.dependencies]
-cffi = {version = "*", markers = "implementation_name == \"pypy\""}
-py = {version = "*", markers = "implementation_name == \"pypy\""}
+pyyaml = "*"
[[package]]
-name = "recommonmark"
-version = "0.7.1"
-description = "A docutils-compatibility bridge to CommonMark, enabling you to write CommonMark inside of Docutils & Sphinx projects."
+name = "pyzmq"
+version = "22.3.0"
+description = "Python bindings for 0MQ"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
[package.dependencies]
-commonmark = ">=0.8.1"
-docutils = ">=0.11"
-sphinx = ">=1.3.1"
+cffi = {version = "*", markers = "implementation_name == \"pypy\""}
+py = {version = "*", markers = "implementation_name == \"pypy\""}
[[package]]
name = "regex"
@@ -1874,19 +1659,6 @@ torchvision = ">=0.5.0"
[package.extras]
test = ["pytest"]
-[[package]]
-name = "send2trash"
-version = "1.8.0"
-description = "Send file to trash natively under Mac OS X, Windows and Linux."
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.extras]
-nativelib = ["pyobjc-framework-cocoa", "pywin32"]
-objc = ["pyobjc-framework-cocoa"]
-win32 = ["pywin32"]
-
[[package]]
name = "setuptools-scm"
version = "6.3.2"
@@ -1918,160 +1690,6 @@ category = "dev"
optional = false
python-versions = ">=3.6"
-[[package]]
-name = "snowballstemmer"
-version = "2.2.0"
-description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms."
-category = "dev"
-optional = false
-python-versions = "*"
-
-[[package]]
-name = "sphinx"
-version = "4.3.2"
-description = "Python documentation generator"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-alabaster = ">=0.7,<0.8"
-babel = ">=1.3"
-colorama = {version = ">=0.3.5", markers = "sys_platform == \"win32\""}
-docutils = ">=0.14,<0.18"
-imagesize = "*"
-Jinja2 = ">=2.3"
-packaging = "*"
-Pygments = ">=2.0"
-requests = ">=2.5.0"
-snowballstemmer = ">=1.1"
-sphinxcontrib-applehelp = "*"
-sphinxcontrib-devhelp = "*"
-sphinxcontrib-htmlhelp = ">=2.0.0"
-sphinxcontrib-jsmath = "*"
-sphinxcontrib-qthelp = "*"
-sphinxcontrib-serializinghtml = ">=1.1.5"
-
-[package.extras]
-docs = ["sphinxcontrib-websupport"]
-lint = ["flake8 (>=3.5.0)", "isort", "mypy (>=0.920)", "docutils-stubs", "types-typed-ast", "types-pkg-resources", "types-requests"]
-test = ["pytest", "pytest-cov", "html5lib", "cython", "typed-ast"]
-
-[[package]]
-name = "sphinx-automodapi"
-version = "0.13"
-description = "Sphinx extension for auto-generating API documentation for entire modules"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-sphinx = ">=1.7"
-
-[package.extras]
-test = ["pytest", "pytest-cov", "cython", "codecov", "coverage (<5.0)"]
-
-[[package]]
-name = "sphinx-copybutton"
-version = "0.4.0"
-description = "Add a copy button to each of your code cells."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-sphinx = ">=1.8"
-
-[package.extras]
-code_style = ["pre-commit (==2.12.1)"]
-rtd = ["sphinx", "ipython", "sphinx-book-theme"]
-
-[[package]]
-name = "sphinx-rtd-theme"
-version = "0.5.2"
-description = "Read the Docs theme for Sphinx"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-docutils = "<0.17"
-sphinx = "*"
-
-[package.extras]
-dev = ["transifex-client", "sphinxcontrib-httpdomain", "bump2version"]
-
-[[package]]
-name = "sphinxcontrib-applehelp"
-version = "1.0.2"
-description = "sphinxcontrib-applehelp is a sphinx extension which outputs Apple help books"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
-[[package]]
-name = "sphinxcontrib-devhelp"
-version = "1.0.2"
-description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
-[[package]]
-name = "sphinxcontrib-htmlhelp"
-version = "2.0.0"
-description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest", "html5lib"]
-
-[[package]]
-name = "sphinxcontrib-jsmath"
-version = "1.0.1"
-description = "A sphinx extension which renders display math in HTML via JavaScript"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-test = ["pytest", "flake8", "mypy"]
-
-[[package]]
-name = "sphinxcontrib-qthelp"
-version = "1.0.3"
-description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
-[[package]]
-name = "sphinxcontrib-serializinghtml"
-version = "1.1.5"
-description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
[[package]]
name = "stevedore"
version = "3.5.0"
@@ -2137,22 +1755,6 @@ category = "dev"
optional = false
python-versions = "*"
-[[package]]
-name = "terminado"
-version = "0.12.1"
-description = "Tornado websocket backend for the Xterm.js Javascript terminal emulator library."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-ptyprocess = {version = "*", markers = "os_name != \"nt\""}
-pywinpty = {version = ">=1.1.0", markers = "os_name == \"nt\""}
-tornado = ">=4"
-
-[package.extras]
-test = ["pytest"]
-
[[package]]
name = "testpath"
version = "0.5.0"
@@ -2396,12 +1998,15 @@ secure = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "cer
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]]
-name = "wcwidth"
-version = "0.2.5"
-description = "Measures the displayed width of unicode strings in a terminal"
+name = "watchdog"
+version = "2.1.9"
+description = "Filesystem events monitoring"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
+
+[package.extras]
+watchmedo = ["PyYAML (>=3.10)"]
[[package]]
name = "webencodings"
@@ -2422,17 +2027,6 @@ python-versions = ">=3.6"
[package.extras]
watchdog = ["watchdog"]
-[[package]]
-name = "widgetsnbextension"
-version = "3.5.2"
-description = "IPython HTML widgets for Jupyter"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-notebook = ">=4.4.1"
-
[[package]]
name = "xxhash"
version = "2.0.2"
@@ -2472,8 +2066,8 @@ vision = ["torchvision"]
[metadata]
lock-version = "1.1"
-python-versions = ">=3.7,<3.10"
-content-hash = "fc368b8c7d45218aac9f1cb98e7ef0bacf53aa953ac2e57cf7d9a25c6c775335"
+python-versions = ">=3.7.1,<4"
+content-hash = "3cbd5faf3ba5cc490829f3baf57a1758df1562e395f7800edb6878397bf46471"
[metadata.files]
absl-py = [
@@ -2558,49 +2152,6 @@ aiosignal = [
{file = "aiosignal-1.2.0-py3-none-any.whl", hash = "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a"},
{file = "aiosignal-1.2.0.tar.gz", hash = "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"},
]
-alabaster = [
- {file = "alabaster-0.7.12-py2.py3-none-any.whl", hash = "sha256:446438bdcca0e05bd45ea2de1668c1d9b032e1a9154c2c259092d77031ddd359"},
- {file = "alabaster-0.7.12.tar.gz", hash = "sha256:a661d72d58e6ea8a57f7a86e37d86716863ee5e92788398526d58b26a4e4dc02"},
-]
-appnope = [
- {file = "appnope-0.1.2-py2.py3-none-any.whl", hash = "sha256:93aa393e9d6c54c5cd570ccadd8edad61ea0c4b9ea7a01409020c9aa019eb442"},
- {file = "appnope-0.1.2.tar.gz", hash = "sha256:dd83cd4b5b460958838f6eb3000c660b1f9caf2a5b1de4264e941512f603258a"},
-]
-argcomplete = [
- {file = "argcomplete-2.0.0-py2.py3-none-any.whl", hash = "sha256:cffa11ea77999bb0dd27bb25ff6dc142a6796142f68d45b1a26b11f58724561e"},
- {file = "argcomplete-2.0.0.tar.gz", hash = "sha256:6372ad78c89d662035101418ae253668445b391755cfe94ea52f1b9d22425b20"},
-]
-argon2-cffi = [
- {file = "argon2-cffi-21.3.0.tar.gz", hash = "sha256:d384164d944190a7dd7ef22c6aa3ff197da12962bd04b17f64d4e93d934dba5b"},
- {file = "argon2_cffi-21.3.0-py3-none-any.whl", hash = "sha256:8c976986f2c5c0e5000919e6de187906cfd81fb1c72bf9d88c01177e77da7f80"},
-]
-argon2-cffi-bindings = [
- {file = "argon2-cffi-bindings-21.2.0.tar.gz", hash = "sha256:bb89ceffa6c791807d1305ceb77dbfacc5aa499891d2c55661c6459651fc39e3"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:ccb949252cb2ab3a08c02024acb77cfb179492d5701c7cbdbfd776124d4d2367"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9524464572e12979364b7d600abf96181d3541da11e23ddf565a32e70bd4dc0d"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b746dba803a79238e925d9046a63aa26bf86ab2a2fe74ce6b009a1c3f5c8f2ae"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:58ed19212051f49a523abb1dbe954337dc82d947fb6e5a0da60f7c8471a8476c"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:bd46088725ef7f58b5a1ef7ca06647ebaf0eb4baff7d1d0d177c6cc8744abd86"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-musllinux_1_1_i686.whl", hash = "sha256:8cd69c07dd875537a824deec19f978e0f2078fdda07fd5c42ac29668dda5f40f"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f1152ac548bd5b8bcecfb0b0371f082037e47128653df2e8ba6e914d384f3c3e"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-win32.whl", hash = "sha256:603ca0aba86b1349b147cab91ae970c63118a0f30444d4bc80355937c950c082"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-win_amd64.whl", hash = "sha256:b2ef1c30440dbbcba7a5dc3e319408b59676e2e039e2ae11a8775ecf482b192f"},
- {file = "argon2_cffi_bindings-21.2.0-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:e415e3f62c8d124ee16018e491a009937f8cf7ebf5eb430ffc5de21b900dad93"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:3e385d1c39c520c08b53d63300c3ecc28622f076f4c2b0e6d7e796e9f6502194"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c3e3cc67fdb7d82c4718f19b4e7a87123caf8a93fde7e23cf66ac0337d3cb3f"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a22ad9800121b71099d0fb0a65323810a15f2e292f2ba450810a7316e128ee5"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f9f8b450ed0547e3d473fdc8612083fd08dd2120d6ac8f73828df9b7d45bb351"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:93f9bf70084f97245ba10ee36575f0c3f1e7d7724d67d8e5b08e61787c320ed7"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:3b9ef65804859d335dc6b31582cad2c5166f0c3e7975f324d9ffaa34ee7e6583"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4966ef5848d820776f5f562a7d45fdd70c2f330c961d0d745b784034bd9f48d"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20ef543a89dee4db46a1a6e206cd015360e5a75822f76df533845c3cbaf72670"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ed2937d286e2ad0cc79a7087d3c272832865f779430e0cc2b4f3718d3159b0cb"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:5e00316dabdaea0b2dd82d141cc66889ced0cdcbfa599e8b471cf22c620c329a"},
-]
-asteroid-sphinx-theme = [
- {file = "asteroid_sphinx_theme-0.0.3-py2.py3-none-any.whl", hash = "sha256:5939dd3c71ce384f4c03ea715618700410e95dfefbda2f81a6a9a0a1795d712d"},
- {file = "asteroid_sphinx_theme-0.0.3.tar.gz", hash = "sha256:e780466db174cf2ec75cecd5321fc80e2820bc7563434a6a8351a61dcdd03d75"},
-]
async-timeout = [
{file = "async-timeout-4.0.2.tar.gz", hash = "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15"},
{file = "async_timeout-4.0.2-py3-none-any.whl", hash = "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"},
@@ -2617,14 +2168,6 @@ attrs = [
{file = "attrs-21.4.0-py2.py3-none-any.whl", hash = "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4"},
{file = "attrs-21.4.0.tar.gz", hash = "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"},
]
-babel = [
- {file = "Babel-2.9.1-py2.py3-none-any.whl", hash = "sha256:ab49e12b91d937cd11f0b67cb259a57ab4ad2b59ac7a3b41d6c06c0ac5b0def9"},
- {file = "Babel-2.9.1.tar.gz", hash = "sha256:bc0c176f9f6a994582230df350aa6e05ba2ebe4b3ac317eab29d9be5d2768da0"},
-]
-backcall = [
- {file = "backcall-0.2.0-py2.py3-none-any.whl", hash = "sha256:fbbce6a29f263178a1f7915c1940bde0ec2b2a967566fe1c65c1dfb7422bd255"},
- {file = "backcall-0.2.0.tar.gz", hash = "sha256:5cbdbf27be5e7cfadb448baf0aa95508f91f2bbc6c6437cd9cd06e2a4c215e1e"},
-]
bandit = [
{file = "bandit-1.7.1-py3-none-any.whl", hash = "sha256:f5acd838e59c038a159b5c621cf0f8270b279e884eadd7b782d7491c02add0d4"},
{file = "bandit-1.7.1.tar.gz", hash = "sha256:a81b00b5436e6880fa8ad6799bc830e02032047713cbb143a12939ac67eb756c"},
@@ -2713,10 +2256,6 @@ colorama = [
{file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"},
{file = "colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"},
]
-commonmark = [
- {file = "commonmark-0.9.1-py2.py3-none-any.whl", hash = "sha256:da2f38c92590f83de410ba1a3cbceafbc74fee9def35f9251ba9a971d6d66fd9"},
- {file = "commonmark-0.9.1.tar.gz", hash = "sha256:452f9dc859be7f06631ddcb328b6919c67984aca654e5fefb3914d54691aed60"},
-]
coverage = [
{file = "coverage-6.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6dbc1536e105adda7a6312c778f15aaabe583b0e9a0b0a324990334fd458c94b"},
{file = "coverage-6.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:174cf9b4bef0db2e8244f82059a5a72bd47e1d40e71c68ab055425172b16b7d0"},
@@ -2774,33 +2313,6 @@ datasets = [
{file = "datasets-1.17.0-py3-none-any.whl", hash = "sha256:8899a69914c7c19029a085e994f9b27dfeacd24b0d04c792e41e65cf1b033cab"},
{file = "datasets-1.17.0.tar.gz", hash = "sha256:3a9f4da403af86b0f6d3a1c44ba049cce52d5737c0a69cff487129d05e1a6d44"},
]
-debugpy = [
- {file = "debugpy-1.5.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:70b422c63a833630c33e3f9cdbd9b6971f8c5afd452697e464339a21bbe862ba"},
- {file = "debugpy-1.5.1-cp310-cp310-win32.whl", hash = "sha256:3a457ad9c0059a21a6c7d563c1f18e924f5cf90278c722bd50ede6f56b77c7fe"},
- {file = "debugpy-1.5.1-cp310-cp310-win_amd64.whl", hash = "sha256:5d76a4fd028d8009c3faf1185b4b78ceb2273dd2499447664b03939e0368bb90"},
- {file = "debugpy-1.5.1-cp36-cp36m-macosx_10_15_x86_64.whl", hash = "sha256:16db27b4b91991442f91d73604d32080b30de655aca9ba821b1972ea8171021b"},
- {file = "debugpy-1.5.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2b073ad5e8d8c488fbb6a116986858bab0c9c4558f28deb8832c7a5a27405bd6"},
- {file = "debugpy-1.5.1-cp36-cp36m-win32.whl", hash = "sha256:318f81f37341e4e054b4267d39896b73cddb3612ca13b39d7eea45af65165e1d"},
- {file = "debugpy-1.5.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b5b3157372e0e0a1297a8b6b5280bcf1d35a40f436c7973771c972726d1e32d5"},
- {file = "debugpy-1.5.1-cp37-cp37m-macosx_10_15_x86_64.whl", hash = "sha256:1ec3a086e14bba6c472632025b8fe5bdfbaef2afa1ebd5c6615ce6ed8d89bc67"},
- {file = "debugpy-1.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:26fbe53cca45a608679094791ce587b6e2798acd1d4777a8b303b07622e85182"},
- {file = "debugpy-1.5.1-cp37-cp37m-win32.whl", hash = "sha256:d876db8c312eeb02d85611e0f696abe66a2c1515e6405943609e725d5ff36f2a"},
- {file = "debugpy-1.5.1-cp37-cp37m-win_amd64.whl", hash = "sha256:4404a62fb5332ea5c8c9132290eef50b3a0ba38cecacad5529e969a783bcbdd7"},
- {file = "debugpy-1.5.1-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:f3a3dca9104aa14fd4210edcce6d9ce2b65bd9618c0b222135a40b9d6e2a9eeb"},
- {file = "debugpy-1.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:b2df2c373e85871086bd55271c929670cd4e1dba63e94a08d442db830646203b"},
- {file = "debugpy-1.5.1-cp38-cp38-win32.whl", hash = "sha256:82f5f9ce93af6861a0713f804e62ab390bb12a17f113153e47fea8bbb1dfbe36"},
- {file = "debugpy-1.5.1-cp38-cp38-win_amd64.whl", hash = "sha256:17a25ce9d7714f92fc97ef00cc06269d7c2b163094990ada30156ed31d9a5030"},
- {file = "debugpy-1.5.1-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:01e98c594b3e66d529e40edf314f849cd1a21f7a013298df58cd8e263bf8e184"},
- {file = "debugpy-1.5.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f73988422b17f071ad3c4383551ace1ba5ed810cbab5f9c362783d22d40a08dc"},
- {file = "debugpy-1.5.1-cp39-cp39-win32.whl", hash = "sha256:23df67fc56d59e386c342428a7953c2c06cc226d8525b11319153e96afb65b0c"},
- {file = "debugpy-1.5.1-cp39-cp39-win_amd64.whl", hash = "sha256:a2aa64f6d2ca7ded8a7e8a4e7cae3bc71866b09876b7b05cecad231779cb9156"},
- {file = "debugpy-1.5.1-py2.py3-none-any.whl", hash = "sha256:194f95dd3e84568b5489aab5689a3a2c044e8fdc06f1890b8b4f70b6b89f2778"},
- {file = "debugpy-1.5.1.zip", hash = "sha256:d2b09e91fbd1efa4f4fda121d49af89501beda50c18ed7499712c71a4bf3452e"},
-]
-decorator = [
- {file = "decorator-5.1.1-py3-none-any.whl", hash = "sha256:b8c3f85900b9dc423225913c5aace94729fe1fa9763b38939a95226f02d37186"},
- {file = "decorator-5.1.1.tar.gz", hash = "sha256:637996211036b6385ef91435e4fae22989472f9d571faba8927ba8253acbc330"},
-]
defusedxml = [
{file = "defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61"},
{file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"},
@@ -2910,12 +2422,16 @@ frozenlist = [
{file = "frozenlist-1.2.0.tar.gz", hash = "sha256:68201be60ac56aff972dc18085800b6ee07973c49103a8aba669dee3d71079de"},
]
fsspec = [
- {file = "fsspec-2022.1.0-py3-none-any.whl", hash = "sha256:256e2be44e62430c9ca8dac2e480384b00a3c52aef4e2b0b7204163fdc861d37"},
- {file = "fsspec-2022.1.0.tar.gz", hash = "sha256:0bdd519bbf4d8c9a1d893a50b5ebacc89acd0e1fe0045d2f7b0e0c1af5990edc"},
+ {file = "fsspec-2022.5.0-py3-none-any.whl", hash = "sha256:2c198c50eb541a80bbd03540b07602c4a957366f3fb416a1f270d34bd4ff0926"},
+ {file = "fsspec-2022.5.0.tar.gz", hash = "sha256:7a5459c75c44e760fbe6a3ccb1f37e81e023cde7da8ba20401258d877ec483b4"},
]
future = [
{file = "future-0.18.2.tar.gz", hash = "sha256:b1bead90b70cf6ec3f0710ae53a525360fa360d306a86583adc6bf83a4db537d"},
]
+ghp-import = [
+ {file = "ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343"},
+ {file = "ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619"},
+]
gitdb = [
{file = "gitdb-4.0.9-py3-none-any.whl", hash = "sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd"},
{file = "gitdb-4.0.9.tar.gz", hash = "sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa"},
@@ -3009,10 +2525,6 @@ idna = [
{file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"},
{file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"},
]
-imagesize = [
- {file = "imagesize-1.3.0-py2.py3-none-any.whl", hash = "sha256:1db2f82529e53c3e929e8926a1fa9235aa82d0bd0c580359c67ec31b2fddaa8c"},
- {file = "imagesize-1.3.0.tar.gz", hash = "sha256:cd1750d452385ca327479d45b64d9c7729ecf0b3969a58148298c77092261f9d"},
-]
importlib-metadata = [
{file = "importlib_metadata-4.10.0-py3-none-any.whl", hash = "sha256:b7cf7d3fef75f1e4c80a96ca660efbd51473d7e8f39b5ab9210febc7809012a4"},
{file = "importlib_metadata-4.10.0.tar.gz", hash = "sha256:92a8b58ce734b2a4494878e0ecf7d79ccd7a128b5fc6014c401e0b61f006f0f6"},
@@ -3025,26 +2537,10 @@ iniconfig = [
{file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"},
{file = "iniconfig-1.1.1.tar.gz", hash = "sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"},
]
-ipykernel = [
- {file = "ipykernel-6.6.1-py3-none-any.whl", hash = "sha256:de99f6c1caa72578305cc96122ee3a19669e9c1958694a2b564ed1be28240ab9"},
- {file = "ipykernel-6.6.1.tar.gz", hash = "sha256:91ff0058b45660aad4a68088041059c0d378cd53fc8aff60e5abc91bcc049353"},
-]
-ipython = [
- {file = "ipython-7.31.1-py3-none-any.whl", hash = "sha256:55df3e0bd0f94e715abd968bedd89d4e8a7bce4bf498fb123fed4f5398fea874"},
- {file = "ipython-7.31.1.tar.gz", hash = "sha256:b5548ec5329a4bcf054a5deed5099b0f9622eb9ea51aaa7104d215fece201d8c"},
-]
ipython-genutils = [
{file = "ipython_genutils-0.2.0-py2.py3-none-any.whl", hash = "sha256:72dd37233799e619666c9f639a9da83c34013a73e8bbc79a7a6348d93c61fab8"},
{file = "ipython_genutils-0.2.0.tar.gz", hash = "sha256:eb2e116e75ecef9d4d228fdc66af54269afa26ab4463042e33785b887c628ba8"},
]
-ipywidgets = [
- {file = "ipywidgets-7.6.5-py2.py3-none-any.whl", hash = "sha256:d258f582f915c62ea91023299603be095de19afb5ee271698f88327b9fe9bf43"},
- {file = "ipywidgets-7.6.5.tar.gz", hash = "sha256:00974f7cb4d5f8d494c19810fedb9fa9b64bffd3cda7c2be23c133a1ad3c99c5"},
-]
-jedi = [
- {file = "jedi-0.18.1-py2.py3-none-any.whl", hash = "sha256:637c9635fcf47945ceb91cd7f320234a7be540ded6f3e99a50cb6febdfd1ba8d"},
- {file = "jedi-0.18.1.tar.gz", hash = "sha256:74137626a64a99c8eb6ae5832d99b3bdd7d29a3850fe2aa80a4126b2a7d949ab"},
-]
jinja2 = [
{file = "Jinja2-3.0.3-py3-none-any.whl", hash = "sha256:077ce6014f7b40d03b47d1f1ca4b0fc8328a692bd284016f806ed0eaca390ad8"},
{file = "Jinja2-3.0.3.tar.gz", hash = "sha256:611bb273cd68f3b993fabdc4064fc858c5b47a973cb5aa7999ec1ba405c87cd7"},
@@ -3054,8 +2550,8 @@ joblib = [
{file = "joblib-1.1.0.tar.gz", hash = "sha256:4158fcecd13733f8be669be0683b96ebdbbd38d23559f54dca7205aea1bf1e35"},
]
jsonargparse = [
- {file = "jsonargparse-4.1.4-py3-none-any.whl", hash = "sha256:322ad89c587bed36903f0db74f94fe58771cbd549f8c1f2c381d7a44b2a342d7"},
- {file = "jsonargparse-4.1.4.tar.gz", hash = "sha256:ec17d4fe1cab006bdda13185b77d94e181e9c499634a58a3f2a95f545f8bc185"},
+ {file = "jsonargparse-4.9.0-py3-none-any.whl", hash = "sha256:aecd494346c251dd34372239b9bafe46fc7d760f07dc548d6aac58176cf3fce2"},
+ {file = "jsonargparse-4.9.0.tar.gz", hash = "sha256:4a2f4194796eb5d1a36179efdfc7e6bc383d9757b977192b4b2a6ea39d04b69d"},
]
jsonschema = [
{file = "jsonschema-4.3.3-py3-none-any.whl", hash = "sha256:eb7a69801beb7325653aa8fd373abbf9ff8f85b536ab2812e5e8287b522fb6a2"},
@@ -3069,17 +2565,13 @@ jupyter-core = [
{file = "jupyter_core-4.9.1-py3-none-any.whl", hash = "sha256:1c091f3bbefd6f2a8782f2c1db662ca8478ac240e962ae2c66f0b87c818154ea"},
{file = "jupyter_core-4.9.1.tar.gz", hash = "sha256:dce8a7499da5a53ae3afd5a9f4b02e5df1d57250cf48f3ad79da23b4778cd6fa"},
]
-jupyter-sphinx = [
- {file = "jupyter_sphinx-0.3.2-py3-none-any.whl", hash = "sha256:301e36d0fb3007bb5802f6b65b60c24990eb99c983332a2ab6eecff385207dc9"},
- {file = "jupyter_sphinx-0.3.2.tar.gz", hash = "sha256:37fc9408385c45326ac79ca0452fbd7ae2bf0e97842d626d2844d4830e30aaf2"},
-]
jupyterlab-pygments = [
{file = "jupyterlab_pygments-0.1.2-py2.py3-none-any.whl", hash = "sha256:abfb880fd1561987efaefcb2d2ac75145d2a5d0139b1876d5be806e32f630008"},
{file = "jupyterlab_pygments-0.1.2.tar.gz", hash = "sha256:cfcda0873626150932f438eccf0f8bf22bfa92345b814890ab360d666b254146"},
]
-jupyterlab-widgets = [
- {file = "jupyterlab_widgets-1.0.2-py3-none-any.whl", hash = "sha256:f5d9efface8ec62941173ba1cffb2edd0ecddc801c11ae2931e30b50492eb8f7"},
- {file = "jupyterlab_widgets-1.0.2.tar.gz", hash = "sha256:7885092b2b96bf189c3a705cc3c412a4472ec5e8382d0b47219a66cccae73cfa"},
+jupytext = [
+ {file = "jupytext-1.13.8-py3-none-any.whl", hash = "sha256:625d2d2012763cc87d3f0dd60383516cec442c11894f53ad0c5ee5aa2a52caa2"},
+ {file = "jupytext-1.13.8.tar.gz", hash = "sha256:60148537de5aa08bb9cbe8797500a49360b7a8eb6667736ae5b80e3ec7ba084d"},
]
kiwisolver = [
{file = "kiwisolver-1.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1d819553730d3c2724582124aee8a03c846ec4362ded1034c16fb3ef309264e6"},
@@ -3140,6 +2632,10 @@ markdown = [
{file = "Markdown-3.3.6-py3-none-any.whl", hash = "sha256:9923332318f843411e9932237530df53162e29dc7a4e2b91e35764583c46c9a3"},
{file = "Markdown-3.3.6.tar.gz", hash = "sha256:76df8ae32294ec39dcf89340382882dfa12975f87f45c3ed1ecdb1e8cefc7006"},
]
+markdown-it-py = [
+ {file = "markdown-it-py-2.1.0.tar.gz", hash = "sha256:cf7e59fed14b5ae17c0006eff14a2d9a00ed5f3a846148153899a0224e2c07da"},
+ {file = "markdown_it_py-2.1.0-py3-none-any.whl", hash = "sha256:93de681e5c021a432c63147656fe21790bc01231e0cd2da73626f1aa3ac0fe27"},
+]
markupsafe = [
{file = "MarkupSafe-2.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d8446c54dc28c01e5a2dbac5a25f071f6653e6e40f3a8818e8b45d790fe6ef53"},
{file = "MarkupSafe-2.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:36bc903cbb393720fad60fc28c10de6acf10dc6cc883f3e24ee4012371399a38"},
@@ -3233,18 +2729,41 @@ matplotlib = [
{file = "matplotlib-3.5.1-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:14334b9902ec776461c4b8c6516e26b450f7ebe0b3ef8703bf5cdfbbaecf774a"},
{file = "matplotlib-3.5.1.tar.gz", hash = "sha256:b2e9810e09c3a47b73ce9cab5a72243a1258f61e7900969097a817232246ce1c"},
]
-matplotlib-inline = [
- {file = "matplotlib-inline-0.1.3.tar.gz", hash = "sha256:a04bfba22e0d1395479f866853ec1ee28eea1485c1d69a6faf00dc3e24ff34ee"},
- {file = "matplotlib_inline-0.1.3-py3-none-any.whl", hash = "sha256:aed605ba3b72462d64d475a21a9296f400a19c4f74a31b59103d2a99ffd5aa5c"},
-]
mccabe = [
{file = "mccabe-0.6.1-py2.py3-none-any.whl", hash = "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42"},
{file = "mccabe-0.6.1.tar.gz", hash = "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"},
]
+mdit-py-plugins = [
+ {file = "mdit-py-plugins-0.3.0.tar.gz", hash = "sha256:ecc24f51eeec6ab7eecc2f9724e8272c2fb191c2e93cf98109120c2cace69750"},
+ {file = "mdit_py_plugins-0.3.0-py3-none-any.whl", hash = "sha256:b1279701cee2dbf50e188d3da5f51fee8d78d038cdf99be57c6b9d1aa93b4073"},
+]
+mdurl = [
+ {file = "mdurl-0.1.1-py3-none-any.whl", hash = "sha256:6a8f6804087b7128040b2fb2ebe242bdc2affaeaa034d5fc9feeed30b443651b"},
+ {file = "mdurl-0.1.1.tar.gz", hash = "sha256:f79c9709944df218a4cdb0fcc0b0c7ead2f44594e3e84dc566606f04ad749c20"},
+]
+mergedeep = [
+ {file = "mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307"},
+ {file = "mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8"},
+]
mistune = [
{file = "mistune-0.8.4-py2.py3-none-any.whl", hash = "sha256:88a1051873018da288eee8538d476dffe1262495144b33ecb586c4ab266bb8d4"},
{file = "mistune-0.8.4.tar.gz", hash = "sha256:59a3429db53c50b5c6bcc8a07f8848cb00d7dc8bdb431a4ab41920d201d4756e"},
]
+mkdocs = [
+ {file = "mkdocs-1.3.0-py3-none-any.whl", hash = "sha256:26bd2b03d739ac57a3e6eed0b7bcc86168703b719c27b99ad6ca91dc439aacde"},
+ {file = "mkdocs-1.3.0.tar.gz", hash = "sha256:b504405b04da38795fec9b2e5e28f6aa3a73bb0960cb6d5d27ead28952bd35ea"},
+]
+mkdocs-jupyter = [
+ {file = "mkdocs-jupyter-0.21.0.tar.gz", hash = "sha256:c8c00ce44456e3cf50c5dc3fe0cb18fab6467fb5bafc2c0bfe1efff3e0a52470"},
+]
+mkdocs-material = [
+ {file = "mkdocs-material-8.3.6.tar.gz", hash = "sha256:be8f95c0dfb927339b55b2cc066423dc0b381be9828ff74a5b02df979a859b66"},
+ {file = "mkdocs_material-8.3.6-py2.py3-none-any.whl", hash = "sha256:01f3fbab055751b3b75a64b538e86b9ce0c6a0f8d43620f6287dfa16534443e5"},
+]
+mkdocs-material-extensions = [
+ {file = "mkdocs-material-extensions-1.0.3.tar.gz", hash = "sha256:bfd24dfdef7b41c312ede42648f9eb83476ea168ec163b613f9abd12bbfddba2"},
+ {file = "mkdocs_material_extensions-1.0.3-py3-none-any.whl", hash = "sha256:a82b70e533ce060b2a5d9eb2bc2e1be201cf61f901f93704b4acf6e3d5983a44"},
+]
multidict = [
{file = "multidict-5.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3822c5894c72e3b35aae9909bef66ec83e44522faf767c0ad39e0e2de11d3b55"},
{file = "multidict-5.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:28e6d883acd8674887d7edc896b91751dc2d8e87fbdca8359591a13872799e4e"},
@@ -3379,18 +2898,10 @@ nbformat = [
{file = "nbformat-5.1.3-py3-none-any.whl", hash = "sha256:eb8447edd7127d043361bc17f2f5a807626bc8e878c7709a1c647abda28a9171"},
{file = "nbformat-5.1.3.tar.gz", hash = "sha256:b516788ad70771c6250977c1374fcca6edebe6126fd2adb5a69aa5c2356fd1c8"},
]
-nbsphinx = [
- {file = "nbsphinx-0.8.8-py3-none-any.whl", hash = "sha256:c6c3875f8735b9ea57d65f81a7e240542daa613cad10661c54e0adee4e77938c"},
- {file = "nbsphinx-0.8.8.tar.gz", hash = "sha256:b5090c824b330b36c2715065a1a179ad36526bff208485a9865453d1ddfc34ec"},
-]
nest-asyncio = [
{file = "nest_asyncio-1.5.4-py3-none-any.whl", hash = "sha256:3fdd0d6061a2bb16f21fe8a9c6a7945be83521d81a0d15cff52e9edee50101d6"},
{file = "nest_asyncio-1.5.4.tar.gz", hash = "sha256:f969f6013a16fadb4adcf09d11a68a4f617c6049d7af7ac2c676110169a63abd"},
]
-notebook = [
- {file = "notebook-6.4.6-py3-none-any.whl", hash = "sha256:5cad068fa82cd4fb98d341c052100ed50cd69fbfb4118cb9b8ab5a346ef27551"},
- {file = "notebook-6.4.6.tar.gz", hash = "sha256:7bcdf79bd1cda534735bd9830d2cbedab4ee34d8fe1df6e7b946b3aab0902ba3"},
-]
numpy = [
{file = "numpy-1.21.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:301e408a052fdcda5cdcf03021ebafc3c6ea093021bf9d1aa47c54d48bdad166"},
{file = "numpy-1.21.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a7e8f6216f180f3fd4efb73de5d1eaefb5f5a1ee5b645c67333033e39440e63a"},
@@ -3423,10 +2934,6 @@ numpy = [
{file = "numpy-1.21.5-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a7c4b701ca418cd39e28ec3b496e6388fe06de83f5f0cb74794fa31cfa384c02"},
{file = "numpy-1.21.5.zip", hash = "sha256:6a5928bc6241264dce5ed509e66f33676fc97f464e7a919edc672fb5532221ee"},
]
-numpydoc = [
- {file = "numpydoc-1.1.0-py3-none-any.whl", hash = "sha256:c53d6311190b9e3b9285bc979390ba0257ba9acde5eca1a7065fc8dfca9d46e8"},
- {file = "numpydoc-1.1.0.tar.gz", hash = "sha256:c36fd6cb7ffdc9b4e165a43f67bf6271a7b024d0bb6b00ac468c9e2bfc76448e"},
-]
oauthlib = [
{file = "oauthlib-3.1.1-py2.py3-none-any.whl", hash = "sha256:42bf6354c2ed8c6acb54d971fce6f88193d97297e18602a3a886603f9d7730cc"},
{file = "oauthlib-3.1.1.tar.gz", hash = "sha256:8f0215fcc533dd8dd1bee6f4c412d4f0cd7297307d43ac61666389e3bc3198a3"},
@@ -3465,10 +2972,6 @@ pandocfilters = [
{file = "pandocfilters-1.5.0-py2.py3-none-any.whl", hash = "sha256:33aae3f25fd1a026079f5d27bdd52496f0e0803b3469282162bafdcbdf6ef14f"},
{file = "pandocfilters-1.5.0.tar.gz", hash = "sha256:0b679503337d233b4339a817bfc8c50064e2eff681314376a47cb582305a7a38"},
]
-parso = [
- {file = "parso-0.8.3-py2.py3-none-any.whl", hash = "sha256:c001d4636cd3aecdaf33cbb40aebb59b094be2a74c556778ef5576c175e19e75"},
- {file = "parso-0.8.3.tar.gz", hash = "sha256:8c07be290bb59f03588915921e29e8a50002acaf2cdc5fa0e0114f91709fafa0"},
-]
pathspec = [
{file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"},
{file = "pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"},
@@ -3477,14 +2980,6 @@ pbr = [
{file = "pbr-5.8.0-py2.py3-none-any.whl", hash = "sha256:176e8560eaf61e127817ef93d8a844803abb27a4d4637f0ff3bb783129be2e0a"},
{file = "pbr-5.8.0.tar.gz", hash = "sha256:672d8ebee84921862110f23fcec2acea191ef58543d34dfe9ef3d9f13c31cddf"},
]
-pexpect = [
- {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
- {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
-]
-pickleshare = [
- {file = "pickleshare-0.7.5-py2.py3-none-any.whl", hash = "sha256:9649af414d74d4df115d5d718f82acb59c9d418196b7b4290ed47a12ce62df56"},
- {file = "pickleshare-0.7.5.tar.gz", hash = "sha256:87683d47965c1da65cdacaf31c8441d12b8044cdec9aca500cd78fc2c683afca"},
-]
pillow = [
{file = "Pillow-9.0.0-cp310-cp310-macosx_10_10_universal2.whl", hash = "sha256:113723312215b25c22df1fdf0e2da7a3b9c357a7d24a93ebbe80bfda4f37a8d4"},
{file = "Pillow-9.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bb47a548cea95b86494a26c89d153fd31122ed65255db5dcbc421a2d28eb3379"},
@@ -3530,14 +3025,6 @@ pluggy = [
pretrainedmodels = [
{file = "pretrainedmodels-0.7.4.tar.gz", hash = "sha256:7e77ead4619a3e11ab3c41982c8ad5b86edffe37c87fd2a37ec3c2cc6470b98a"},
]
-prometheus-client = [
- {file = "prometheus_client-0.12.0-py2.py3-none-any.whl", hash = "sha256:317453ebabff0a1b02df7f708efbab21e3489e7072b61cb6957230dd004a0af0"},
- {file = "prometheus_client-0.12.0.tar.gz", hash = "sha256:1b12ba48cee33b9b0b9de64a1047cbd3c5f2d0ab6ebcead7ddda613a750ec3c5"},
-]
-prompt-toolkit = [
- {file = "prompt_toolkit-3.0.24-py3-none-any.whl", hash = "sha256:e56f2ff799bacecd3e88165b1e2f5ebf9bcd59e80e06d395fa0cc4b8bd7bb506"},
- {file = "prompt_toolkit-3.0.24.tar.gz", hash = "sha256:1bb05628c7d87b645974a1bad3f17612be0c29fa39af9f7688030163f680bad6"},
-]
protobuf = [
{file = "protobuf-3.19.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d80f80eb175bf5f1169139c2e0c5ada98b1c098e2b3c3736667f28cbbea39fc8"},
{file = "protobuf-3.19.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:a529e7df52204565bcd33738a7a5f288f3d2d37d86caa5d78c458fa5fabbd54d"},
@@ -3564,10 +3051,6 @@ protobuf = [
{file = "protobuf-3.19.1-py2.py3-none-any.whl", hash = "sha256:e813b1c9006b6399308e917ac5d298f345d95bb31f46f02b60cd92970a9afa17"},
{file = "protobuf-3.19.1.tar.gz", hash = "sha256:62a8e4baa9cb9e064eb62d1002eca820857ab2138440cb4b3ea4243830f94ca7"},
]
-ptyprocess = [
- {file = "ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35"},
- {file = "ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220"},
-]
py = [
{file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"},
{file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"},
@@ -3657,8 +3140,12 @@ pyflakes = [
{file = "pyflakes-2.3.1.tar.gz", hash = "sha256:f5bc8ecabc05bb9d291eb5203d6810b49040f6ff446a756326104746cc00c1db"},
]
pygments = [
- {file = "Pygments-2.11.2-py3-none-any.whl", hash = "sha256:44238f1b60a76d78fc8ca0528ee429702aae011c265fe6a8dd8b63049ae41c65"},
- {file = "Pygments-2.11.2.tar.gz", hash = "sha256:4e426f72023d88d03b2fa258de560726ce890ff3b630f88c21cbb8b2503b8c6a"},
+ {file = "Pygments-2.12.0-py3-none-any.whl", hash = "sha256:dc9c10fb40944260f6ed4c688ece0cd2048414940f1cea51b8b226318411c519"},
+ {file = "Pygments-2.12.0.tar.gz", hash = "sha256:5eb116118f9612ff1ee89ac96437bb6b49e8f04d8a13b514ba26f620208e26eb"},
+]
+pymdown-extensions = [
+ {file = "pymdown_extensions-9.5-py3-none-any.whl", hash = "sha256:ec141c0f4983755349f0c8710416348d1a13753976c028186ed14f190c8061c4"},
+ {file = "pymdown_extensions-9.5.tar.gz", hash = "sha256:3ef2d998c0d5fa7eb09291926d90d69391283561cf6306f85cd588a5eb5befa0"},
]
pyparsing = [
{file = "pyparsing-3.0.6-py3-none-any.whl", hash = "sha256:04ff808a5b90911829c55c4e26f75fa5ca8a2f5f36aa3a51f68e27033341d3e4"},
@@ -3729,14 +3216,6 @@ pywin32 = [
{file = "pywin32-303-cp39-cp39-win32.whl", hash = "sha256:7d3271c98434617a11921c5ccf74615794d97b079e22ed7773790822735cc352"},
{file = "pywin32-303-cp39-cp39-win_amd64.whl", hash = "sha256:79cbb862c11b9af19bcb682891c1b91942ec2ff7de8151e2aea2e175899cda34"},
]
-pywinpty = [
- {file = "pywinpty-1.1.6-cp310-none-win_amd64.whl", hash = "sha256:5f526f21b569b5610a61e3b6126259c76da979399598e5154498582df3736ade"},
- {file = "pywinpty-1.1.6-cp36-none-win_amd64.whl", hash = "sha256:7576e14f42b31fa98b62d24ded79754d2ea4625570c016b38eb347ce158a30f2"},
- {file = "pywinpty-1.1.6-cp37-none-win_amd64.whl", hash = "sha256:979ffdb9bdbe23db3f46fc7285fd6dbb86b80c12325a50582b211b3894072354"},
- {file = "pywinpty-1.1.6-cp38-none-win_amd64.whl", hash = "sha256:2308b1fc77545427610a705799d4ead5e7f00874af3fb148a03e202437456a7e"},
- {file = "pywinpty-1.1.6-cp39-none-win_amd64.whl", hash = "sha256:c703bf569a98ab7844b9daf37e88ab86f31862754ef6910a8b3824993a525c72"},
- {file = "pywinpty-1.1.6.tar.gz", hash = "sha256:8808f07350c709119cc4464144d6e749637f98e15acc1e5d3c37db1953d2eebc"},
-]
pyyaml = [
{file = "PyYAML-6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4db7c7aef085872ef65a8fd7d6d09a14ae91f691dec3e87ee5ee0539d516f53"},
{file = "PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9df7ed3b3d2e0ecfe09e14741b857df43adb5a3ddadc919a2d94fbdf78fea53c"},
@@ -3772,6 +3251,10 @@ pyyaml = [
{file = "PyYAML-6.0-cp39-cp39-win_amd64.whl", hash = "sha256:b3d267842bf12586ba6c734f89d1f5b871df0273157918b0ccefa29deb05c21c"},
{file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
]
+pyyaml-env-tag = [
+ {file = "pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069"},
+ {file = "pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb"},
+]
pyzmq = [
{file = "pyzmq-22.3.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:6b217b8f9dfb6628f74b94bdaf9f7408708cb02167d644edca33f38746ca12dd"},
{file = "pyzmq-22.3.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2841997a0d85b998cbafecb4183caf51fd19c4357075dfd33eb7efea57e4c149"},
@@ -3811,10 +3294,6 @@ pyzmq = [
{file = "pyzmq-22.3.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:d6157793719de168b199194f6b6173f0ccd3bf3499e6870fac17086072e39115"},
{file = "pyzmq-22.3.0.tar.gz", hash = "sha256:8eddc033e716f8c91c6a2112f0a8ebc5e00532b4a6ae1eb0ccc48e027f9c671c"},
]
-recommonmark = [
- {file = "recommonmark-0.7.1-py2.py3-none-any.whl", hash = "sha256:1b1db69af0231efce3fa21b94ff627ea33dee7079a01dd0a7f8482c3da148b3f"},
- {file = "recommonmark-0.7.1.tar.gz", hash = "sha256:bdb4db649f2222dcd8d2d844f0006b958d627f732415d399791ee436a3686d67"},
-]
regex = [
{file = "regex-2021.11.10-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9345b6f7ee578bad8e475129ed40123d265464c4cfead6c261fd60fc9de00bcf"},
{file = "regex-2021.11.10-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:416c5f1a188c91e3eb41e9c8787288e707f7d2ebe66e0a6563af280d9b68478f"},
@@ -3952,10 +3431,6 @@ segmentation-models-pytorch = [
{file = "segmentation_models_pytorch-0.2.1-py3-none-any.whl", hash = "sha256:98822571470867fb0f416c112c32f7f1d21702dd32195ec8f7736932c2de0486"},
{file = "segmentation_models_pytorch-0.2.1.tar.gz", hash = "sha256:86744552b04c6bedf7e10f7928791894d8d9b399b9ed58ed1a3236d2bf69ead6"},
]
-send2trash = [
- {file = "Send2Trash-1.8.0-py3-none-any.whl", hash = "sha256:f20eaadfdb517eaca5ce077640cb261c7d2698385a6a0f072a4a5447fd49fa08"},
- {file = "Send2Trash-1.8.0.tar.gz", hash = "sha256:d2c24762fd3759860a0aff155e45871447ea58d2be6bdd39b5c8f966a0c99c2d"},
-]
setuptools-scm = [
{file = "setuptools_scm-6.3.2-py3-none-any.whl", hash = "sha256:4c64444b1d49c4063ae60bfe1680f611c8b13833d556fd1d6050c0023162a119"},
{file = "setuptools_scm-6.3.2.tar.gz", hash = "sha256:a49aa8081eeb3514eb9728fa5040f2eaa962d6c6f4ec9c32f6c1fba88f88a0f2"},
@@ -3968,50 +3443,6 @@ smmap = [
{file = "smmap-5.0.0-py3-none-any.whl", hash = "sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94"},
{file = "smmap-5.0.0.tar.gz", hash = "sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936"},
]
-snowballstemmer = [
- {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"},
- {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
-]
-sphinx = [
- {file = "Sphinx-4.3.2-py3-none-any.whl", hash = "sha256:6a11ea5dd0bdb197f9c2abc2e0ce73e01340464feaece525e64036546d24c851"},
- {file = "Sphinx-4.3.2.tar.gz", hash = "sha256:0a8836751a68306b3fe97ecbe44db786f8479c3bf4b80e3a7f5c838657b4698c"},
-]
-sphinx-automodapi = [
- {file = "sphinx-automodapi-0.13.tar.gz", hash = "sha256:e1019336df7f7f0bcbf848eff7b84e7bef71691a57d8b5bda9107a2a046a226a"},
- {file = "sphinx_automodapi-0.13-py3-none-any.whl", hash = "sha256:f9ebc9c10597f3aab1d93e5a8b1829903eee7c64f5bafb0cf71fd40e5c7d95f0"},
-]
-sphinx-copybutton = [
- {file = "sphinx-copybutton-0.4.0.tar.gz", hash = "sha256:8daed13a87afd5013c3a9af3575cc4d5bec052075ccd3db243f895c07a689386"},
- {file = "sphinx_copybutton-0.4.0-py3-none-any.whl", hash = "sha256:4340d33c169dac6dd82dce2c83333412aa786a42dd01a81a8decac3b130dc8b0"},
-]
-sphinx-rtd-theme = [
- {file = "sphinx_rtd_theme-0.5.2-py2.py3-none-any.whl", hash = "sha256:4a05bdbe8b1446d77a01e20a23ebc6777c74f43237035e76be89699308987d6f"},
- {file = "sphinx_rtd_theme-0.5.2.tar.gz", hash = "sha256:32bd3b5d13dc8186d7a42fc816a23d32e83a4827d7d9882948e7b837c232da5a"},
-]
-sphinxcontrib-applehelp = [
- {file = "sphinxcontrib-applehelp-1.0.2.tar.gz", hash = "sha256:a072735ec80e7675e3f432fcae8610ecf509c5f1869d17e2eecff44389cdbc58"},
- {file = "sphinxcontrib_applehelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:806111e5e962be97c29ec4c1e7fe277bfd19e9652fb1a4392105b43e01af885a"},
-]
-sphinxcontrib-devhelp = [
- {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
- {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
-]
-sphinxcontrib-htmlhelp = [
- {file = "sphinxcontrib-htmlhelp-2.0.0.tar.gz", hash = "sha256:f5f8bb2d0d629f398bf47d0d69c07bc13b65f75a81ad9e2f71a63d4b7a2f6db2"},
- {file = "sphinxcontrib_htmlhelp-2.0.0-py2.py3-none-any.whl", hash = "sha256:d412243dfb797ae3ec2b59eca0e52dac12e75a241bf0e4eb861e450d06c6ed07"},
-]
-sphinxcontrib-jsmath = [
- {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
- {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
-]
-sphinxcontrib-qthelp = [
- {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
- {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
-]
-sphinxcontrib-serializinghtml = [
- {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
- {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
-]
stevedore = [
{file = "stevedore-3.5.0-py3-none-any.whl", hash = "sha256:a547de73308fd7e90075bb4d301405bebf705292fa90a90fc3bcf9133f58616c"},
{file = "stevedore-3.5.0.tar.gz", hash = "sha256:f40253887d8712eaa2bb0ea3830374416736dc8ec0e22f5a65092c1174c44335"},
@@ -4031,10 +3462,6 @@ tensorboard-data-server = [
tensorboard-plugin-wit = [
{file = "tensorboard_plugin_wit-1.8.1-py3-none-any.whl", hash = "sha256:ff26bdd583d155aa951ee3b152b3d0cffae8005dc697f72b44a8e8c2a77a8cbe"},
]
-terminado = [
- {file = "terminado-0.12.1-py3-none-any.whl", hash = "sha256:09fdde344324a1c9c6e610ee4ca165c4bb7f5bbf982fceeeb38998a988ef8452"},
- {file = "terminado-0.12.1.tar.gz", hash = "sha256:b20fd93cc57c1678c799799d117874367cc07a3d2d55be95205b1a88fa08393f"},
-]
testpath = [
{file = "testpath-0.5.0-py3-none-any.whl", hash = "sha256:8044f9a0bab6567fc644a3593164e872543bb44225b0e24846e2c89237937589"},
{file = "testpath-0.5.0.tar.gz", hash = "sha256:1acf7a0bcd3004ae8357409fc33751e16d37ccc650921da1094a86581ad1e417"},
@@ -4217,9 +3644,32 @@ urllib3 = [
{file = "urllib3-1.26.8-py2.py3-none-any.whl", hash = "sha256:000ca7f471a233c2251c6c7023ee85305721bfdf18621ebff4fd17a8653427ed"},
{file = "urllib3-1.26.8.tar.gz", hash = "sha256:0e7c33d9a63e7ddfcb86780aac87befc2fbddf46c58dbb487e0855f7ceec283c"},
]
-wcwidth = [
- {file = "wcwidth-0.2.5-py2.py3-none-any.whl", hash = "sha256:beb4802a9cebb9144e99086eff703a642a13d6a0052920003a230f3294bbe784"},
- {file = "wcwidth-0.2.5.tar.gz", hash = "sha256:c4d647b99872929fdb7bdcaa4fbe7f01413ed3d98077df798530e5b04f116c83"},
+watchdog = [
+ {file = "watchdog-2.1.9-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a735a990a1095f75ca4f36ea2ef2752c99e6ee997c46b0de507ba40a09bf7330"},
+ {file = "watchdog-2.1.9-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6b17d302850c8d412784d9246cfe8d7e3af6bcd45f958abb2d08a6f8bedf695d"},
+ {file = "watchdog-2.1.9-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ee3e38a6cc050a8830089f79cbec8a3878ec2fe5160cdb2dc8ccb6def8552658"},
+ {file = "watchdog-2.1.9-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:64a27aed691408a6abd83394b38503e8176f69031ca25d64131d8d640a307591"},
+ {file = "watchdog-2.1.9-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:195fc70c6e41237362ba720e9aaf394f8178bfc7fa68207f112d108edef1af33"},
+ {file = "watchdog-2.1.9-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:bfc4d351e6348d6ec51df007432e6fe80adb53fd41183716017026af03427846"},
+ {file = "watchdog-2.1.9-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8250546a98388cbc00c3ee3cc5cf96799b5a595270dfcfa855491a64b86ef8c3"},
+ {file = "watchdog-2.1.9-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:117ffc6ec261639a0209a3252546b12800670d4bf5f84fbd355957a0595fe654"},
+ {file = "watchdog-2.1.9-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:97f9752208f5154e9e7b76acc8c4f5a58801b338de2af14e7e181ee3b28a5d39"},
+ {file = "watchdog-2.1.9-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:247dcf1df956daa24828bfea5a138d0e7a7c98b1a47cf1fa5b0c3c16241fcbb7"},
+ {file = "watchdog-2.1.9-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:226b3c6c468ce72051a4c15a4cc2ef317c32590d82ba0b330403cafd98a62cfd"},
+ {file = "watchdog-2.1.9-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:d9820fe47c20c13e3c9dd544d3706a2a26c02b2b43c993b62fcd8011bcc0adb3"},
+ {file = "watchdog-2.1.9-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:70af927aa1613ded6a68089a9262a009fbdf819f46d09c1a908d4b36e1ba2b2d"},
+ {file = "watchdog-2.1.9-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ed80a1628cee19f5cfc6bb74e173f1b4189eb532e705e2a13e3250312a62e0c9"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_aarch64.whl", hash = "sha256:9f05a5f7c12452f6a27203f76779ae3f46fa30f1dd833037ea8cbc2887c60213"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_armv7l.whl", hash = "sha256:255bb5758f7e89b1a13c05a5bceccec2219f8995a3a4c4d6968fe1de6a3b2892"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_i686.whl", hash = "sha256:d3dda00aca282b26194bdd0adec21e4c21e916956d972369359ba63ade616153"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_ppc64.whl", hash = "sha256:186f6c55abc5e03872ae14c2f294a153ec7292f807af99f57611acc8caa75306"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:083171652584e1b8829581f965b9b7723ca5f9a2cd7e20271edf264cfd7c1412"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_s390x.whl", hash = "sha256:b530ae007a5f5d50b7fbba96634c7ee21abec70dc3e7f0233339c81943848dc1"},
+ {file = "watchdog-2.1.9-py3-none-manylinux2014_x86_64.whl", hash = "sha256:4f4e1c4aa54fb86316a62a87b3378c025e228178d55481d30d857c6c438897d6"},
+ {file = "watchdog-2.1.9-py3-none-win32.whl", hash = "sha256:5952135968519e2447a01875a6f5fc8c03190b24d14ee52b0f4b1682259520b1"},
+ {file = "watchdog-2.1.9-py3-none-win_amd64.whl", hash = "sha256:7a833211f49143c3d336729b0020ffd1274078e94b0ae42e22f596999f50279c"},
+ {file = "watchdog-2.1.9-py3-none-win_ia64.whl", hash = "sha256:ad576a565260d8f99d97f2e64b0f97a48228317095908568a9d5c786c829d428"},
+ {file = "watchdog-2.1.9.tar.gz", hash = "sha256:43ce20ebb36a51f21fa376f76d1d4692452b2527ccd601950d69ed36b9e21609"},
]
webencodings = [
{file = "webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78"},
@@ -4229,10 +3679,6 @@ werkzeug = [
{file = "Werkzeug-2.0.2-py3-none-any.whl", hash = "sha256:63d3dc1cf60e7b7e35e97fa9861f7397283b75d765afcaefd993d6046899de8f"},
{file = "Werkzeug-2.0.2.tar.gz", hash = "sha256:aa2bb6fc8dee8d6c504c0ac1e7f5f7dc5810a9903e793b6f715a9f015bdadb9a"},
]
-widgetsnbextension = [
- {file = "widgetsnbextension-3.5.2-py2.py3-none-any.whl", hash = "sha256:763a9fdc836d141fa080005a886d63f66f73d56dba1fb5961afc239c77708569"},
- {file = "widgetsnbextension-3.5.2.tar.gz", hash = "sha256:e0731a60ba540cd19bbbefe771a9076dcd2dde90713a8f87f27f53f2d1db7727"},
-]
xxhash = [
{file = "xxhash-2.0.2-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:dac3b94881b943bbe418f5829128b9c48f69a66f816ef8b72ee0129d676dbd7c"},
{file = "xxhash-2.0.2-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:43fd97f332bd581639bb99fe8f09f7e9113d49cad4d21bef0620867f92c802c6"},
diff --git a/pyproject.toml b/pyproject.toml
index fe0dc67b..73a776a9 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -14,7 +14,7 @@ documentation = "https://baal.readthedocs.io"
repository = "https://github.com/ElementAI/baal/"
[tool.poetry.dependencies]
-python = ">=3.7,<3.10"
+python = ">=3.7.1,<4"
torch = ">=1.6.0"
h5py = "^3.4.0"
numpy = "^1.21.2"
@@ -40,24 +40,17 @@ hypothesis = "4.24.0"
flake8 = "^3.9.2"
pytest-mock = "^3.6.1"
black = "^21.8b0"
+mypy = "^0.910"
+bandit = "^1.7.1"
# Documentation
-Sphinx = ">2"
-sphinx-rtd-theme = "^0.5.2"
-asteroid-sphinx-theme = "^0.0.3"
-jupyter-sphinx = "^0.3.2"
-Pygments = ">=2.6.1"
-nbsphinx = "^0.8.7"
-sphinx-automodapi = "^0.13"
-sphinx-copybutton = "^0.4.0"
-numpydoc = "^1.1.0"
docutils = "0.16"
-recommonmark = "^0.7.1"
-mypy = "^0.910"
-bandit = "^1.7.1"
+
# Lightning
lightning-flash = {git = "https://github.com/PyTorchLightning/lightning-flash.git", extras = ["image"]}
+mkdocs-jupyter = "^0.21.0"
+mkdocs-material = "^8.3.6"
[tool.poetry.extras]
vision = ["torchvision"]
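The new documentation dependencies above (mkdocs-jupyter, mkdocs-material) are only half of the switch; they are tied together by the mkdocs.yml created in this patch, whose contents are not reproduced in this excerpt. As a rough illustration only, a minimal configuration for that pairing typically looks like the sketch below; the site name, nav entries, and plugin list are assumptions, not the project's actual file.

```yaml
# Illustrative sketch only -- not the repository's mkdocs.yml.
# Assumes the Material theme plus mkdocs-jupyter for notebook pages.
site_name: baal
theme:
  name: material        # provided by mkdocs-material
plugins:
  - search              # built-in MkDocs search
  - mkdocs-jupyter      # lets .ipynb files be used as documentation pages
nav:
  - Home: index.md
  - User guide: user_guide/index.md
```

Running `mkdocs build` against such a file produces the static site, which matches the `build` target added to docs/Makefile in the next patch.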
From 4db203131eceec300eabe019631cc0cc93d34db4 Mon Sep 17 00:00:00 2001
From: Dref360
Date: Sun, 26 Jun 2022 16:28:25 -0400
Subject: [PATCH 02/10] Update content
---
docs/Makefile | 21 +-
docs/_templates/layout.html | 6 -
docs/api/bayesian.md | 11 +-
docs/api/calibration.md | 7 +-
docs/api/compatibility/huggingface.md | 11 +-
docs/api/compatibility/pytorch-lightning.md | 14 +-
docs/api/dataset_management.md | 14 +-
docs/api/heuristics.md | 15 +-
docs/api/index.md | 33 +-
docs/api/modelwrapper.md | 7 +-
docs/api/utils.md | 7 +-
docs/index.md | 50 +-
docs/industry/index.md | 0
docs/javascripts/mathjax.js | 16 +
.../dirichlet_calibration.md | 98 +-
.../double_descent.md} | 72 +-
docs/{blog => research}/images/ALL_active.png | Bin
.../images/BALDvsCBALD_active.png | Bin
.../{blog => research}/images/CBALDvsBALD.png | Bin
.../images/CBALDvsBALDECE.png | Bin
.../images/EntvsCEnt_active.png | Bin
.../images/dirichlet_calib.png | Bin
.../images/doubledescend_01.png | Bin
.../images/doubledescend_02.png | Bin
.../images/doubledescend_03.png | Bin
.../images/doubledescend_04.png | Bin
.../literature/Additional papers/duq.md | 12 +-
.../Additional papers/lightcoresets.md | 12 +-
.../Additional papers/sparse_selection.md | 10 +-
docs/research/literature/index.md | 4 +-
docs/support/faq.md | 8 +-
docs/support/index.md | 7 +
docs/tutorials/index.md | 16 +
docs/{industry => tutorials}/label-studio.md | 0
docs/user_guide/baal_cheatsheet.md | 23 +-
docs/user_guide/index.md | 7 +-
mkdocs.yml | 60 +-
notebooks/active_learning_process.ipynb | 869 ++-----------
notebooks/deep_ensemble.ipynb | 1115 +----------------
notebooks/fairness/ActiveFairness.ipynb | 84 +-
notebooks/fundamentals/active-learning.ipynb | 108 +-
notebooks/fundamentals/posteriors.ipynb | 141 ++-
poetry.lock | 139 +-
pyproject.toml | 4 +
44 files changed, 785 insertions(+), 2216 deletions(-)
delete mode 100644 docs/_templates/layout.html
delete mode 100644 docs/industry/index.md
create mode 100644 docs/javascripts/mathjax.js
rename docs/{blog => research}/dirichlet_calibration.md (60%)
rename docs/{blog/double_descend.md => research/double_descent.md} (81%)
rename docs/{blog => research}/images/ALL_active.png (100%)
rename docs/{blog => research}/images/BALDvsCBALD_active.png (100%)
rename docs/{blog => research}/images/CBALDvsBALD.png (100%)
rename docs/{blog => research}/images/CBALDvsBALDECE.png (100%)
rename docs/{blog => research}/images/EntvsCEnt_active.png (100%)
rename docs/{blog => research}/images/dirichlet_calib.png (100%)
rename docs/{blog => research}/images/doubledescend_01.png (100%)
rename docs/{blog => research}/images/doubledescend_02.png (100%)
rename docs/{blog => research}/images/doubledescend_03.png (100%)
rename docs/{blog => research}/images/doubledescend_04.png (100%)
create mode 100644 docs/tutorials/index.md
rename docs/{industry => tutorials}/label-studio.md (100%)
diff --git a/docs/Makefile b/docs/Makefile
index a16605d4..554dcade 100644
--- a/docs/Makefile
+++ b/docs/Makefile
@@ -1,23 +1,10 @@
-# Minimal makefile for Sphinx documentation
+# Minimal makefile for Mkdocs documentation
#
-# You can set these variables from the command line.
-SPHINXOPTS =
-SPHINXBUILD = sphinx-build
-SOURCEDIR = .
-BUILDDIR = _build
-
# Put it first so that "make" without argument is like "make help".
-help:
- @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
-
-.PHONY: help Makefile
-
-# Catch-all target: route all unknown targets to Sphinx using the new
-# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
-%: Makefile
- @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+build:
+ mkdocs build
server:
- open _build/html/index.html
\ No newline at end of file
+	mkdocs serve
\ No newline at end of file
diff --git a/docs/_templates/layout.html b/docs/_templates/layout.html
deleted file mode 100644
index 462ed4c0..00000000
--- a/docs/_templates/layout.html
+++ /dev/null
@@ -1,6 +0,0 @@
-{% extends "!layout.html" %}
-
-{%- block extrahead %}
-
-
-{% endblock %}
\ No newline at end of file
diff --git a/docs/api/bayesian.md b/docs/api/bayesian.md
index 73441544..e0e7902e 100644
--- a/docs/api/bayesian.md
+++ b/docs/api/bayesian.md
@@ -22,11 +22,10 @@ model = MCDropoutConnectModule(model, layers=["Linear"], weight_dropout=0.5)
## API
-```eval_rst
-.. autoclass:: baal.bayesian.dropout.MCDropoutModule
- :members: __init__
+### baal.bayesian.dropout.MCDropoutModule
-..autoclass:: baal.bayesian.weight_drop.MCDropoutConnectModule
- :members: __init__
+::: baal.bayesian.dropout.MCDropoutModule
-```
\ No newline at end of file
+### baal.bayesian.weight_drop.MCDropoutConnectModule
+
+::: baal.bayesian.weight_drop.MCDropoutConnectModule
\ No newline at end of file
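
To make the page above a little more concrete, here is a minimal sketch of both wrappers, reusing the `patch_module` helper from the cheat sheet further down and the `MCDropoutConnectModule` call shown in this file; the toy model and its layer sizes are purely illustrative.

```python
from torch import nn
from baal.bayesian.dropout import patch_module
from baal.bayesian.weight_drop import MCDropoutConnectModule

def make_model():
    # Toy classifier with a Dropout layer so that MC-Dropout has something to keep active.
    return nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Dropout(0.5), nn.Linear(32, 2))

# Option 1: keep the existing Dropout layers active at prediction time (MC-Dropout).
mc_dropout_model = patch_module(make_model())

# Option 2: drop weights of the Linear layers instead (MC-DropConnect), as in the snippet above.
mc_dropconnect_model = MCDropoutConnectModule(make_model(), layers=["Linear"], weight_dropout=0.5)
```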
diff --git a/docs/api/calibration.md b/docs/api/calibration.md
index 6501afa3..5ba08511 100644
--- a/docs/api/calibration.md
+++ b/docs/api/calibration.md
@@ -1,6 +1,5 @@
# Calibration Wrapper
-```eval_rst
-.. autoclass:: baal.calibration.DirichletCalibrator
- :members:
-```
+### baal.calibration.DirichletCalibrator
+
+::: baal.calibration.DirichletCalibrator
diff --git a/docs/api/compatibility/huggingface.md b/docs/api/compatibility/huggingface.md
index 3d1e6b6b..757667f6 100644
--- a/docs/api/compatibility/huggingface.md
+++ b/docs/api/compatibility/huggingface.md
@@ -1,10 +1,7 @@
## HuggingFace Compatibility
- ```eval_rst
-.. autoclass:: baal.transformers_trainer_wrapper.BaalTransformersTrainer
- :members: predict_on_dataset, predict_on_dataset_generator
+**baal.transformers_trainer_wrapper.BaalTransformersTrainer**
+::: baal.transformers_trainer_wrapper.BaalTransformersTrainer
-.. autoclass:: baal.active.nlp_datasets.HuggingFaceDatasets
- :members:
-
-```
\ No newline at end of file
+**baal.active.dataset.nlp_datasets.HuggingFaceDatasets**
+::: baal.active.dataset.nlp_datasets.HuggingFaceDatasets
\ No newline at end of file
diff --git a/docs/api/compatibility/pytorch-lightning.md b/docs/api/compatibility/pytorch-lightning.md
index f1fffb4d..d9f5a27e 100644
--- a/docs/api/compatibility/pytorch-lightning.md
+++ b/docs/api/compatibility/pytorch-lightning.md
@@ -1,12 +1,10 @@
## Pytorch Lightning Compatibility
- ```eval_rst
-.. autoclass:: baal.utils.pytorch_lightning.ResetCallback
- :members: on_train_start
+**baal.utils.pytorch_lightning.ResetCallback**
+::: baal.utils.pytorch_lightning.ResetCallback
-.. autoclass:: baal.utils.pytorch_lightning.BaalTrainer
- :members: predict_on_dataset, predict_on_dataset_generator
+**baal.utils.pytorch_lightning.BaalTrainer**
+::: baal.utils.pytorch_lightning.BaalTrainer
-.. autoclass:: baal.utils.pytorch_lightning.BaaLDataModule
- :members: pool_dataloader
-```
\ No newline at end of file
+**baal.utils.pytorch_lightning.BaaLDataModule**
+::: baal.utils.pytorch_lightning.BaaLDataModule
\ No newline at end of file
diff --git a/docs/api/dataset_management.md b/docs/api/dataset_management.md
index 3cb76b7f..da7df591 100644
--- a/docs/api/dataset_management.md
+++ b/docs/api/dataset_management.md
@@ -36,13 +36,11 @@ assert al_dataset.pool.transform is None
### API
-```eval_rst
-.. autoclass:: baal.active.ActiveLearningDataset
- :members:
+### baal.active.ActiveLearningDataset
+::: baal.active.ActiveLearningDataset
-.. autoclass:: baal.active.ActiveLearningLoop
- :members:
+### baal.active.ActiveLearningLoop
+::: baal.active.ActiveLearningLoop
-.. autoclass:: baal.active.FileDataset
- :members:
-```
\ No newline at end of file
+### baal.active.FileDataset
+::: baal.active.FileDataset
\ No newline at end of file
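
As a quick illustration of the classes documented above, here is a minimal sketch using a synthetic `TensorDataset` in place of a real dataset; the sizes and the initial random split are arbitrary.

```python
import torch
from torch.utils.data import TensorDataset
from baal.active import ActiveLearningDataset

# Stand-in for a real dataset such as CIFAR10.
dataset = TensorDataset(torch.randn(1000, 3, 32, 32), torch.randint(0, 10, (1000,)))

al_dataset = ActiveLearningDataset(dataset)
al_dataset.label_randomly(100)    # bootstrap the labelled set D_L

print(len(al_dataset))            # 100 labelled samples
print(len(al_dataset.pool))       # 900 samples left in the unlabelled pool D_U
```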
diff --git a/docs/api/heuristics.md b/docs/api/heuristics.md
index 635847cb..8172e239 100644
--- a/docs/api/heuristics.md
+++ b/docs/api/heuristics.md
@@ -33,13 +33,14 @@ BALD(reduction="mean")
### API
-```eval_rst
-.. autoclass:: baal.active.heuristics.AbstractHeuristic
- :members:
+### baal.active.heuristics.AbstractHeuristic
+::: baal.active.heuristics.AbstractHeuristic
-.. autoclass:: baal.active.heuristics.BALD
+### baal.active.heuristics.BALD
+::: baal.active.heuristics.BALD
-.. autoclass:: baal.active.heuristics.Random
+### baal.active.heuristics.Random
+::: baal.active.heuristics.Random
-.. autoclass:: baal.active.heuristics.Entropy
-```
\ No newline at end of file
+### baal.active.heuristics.Entropy
+::: baal.active.heuristics.Entropy
\ No newline at end of file
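
A small sketch of how the heuristics above can be applied to Monte Carlo predictions; the `[n_samples, n_classes, n_iterations]` layout is assumed from the cheat sheet table later in this patch, and the random probabilities only stand in for real model outputs.

```python
import numpy as np
from baal.active.heuristics import BALD, Entropy

# Fake Monte Carlo predictions: 50 pool samples, 10 classes, 20 stochastic iterations.
logits = np.random.randn(50, 10, 20)
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax over classes

bald_scores = BALD().get_uncertainties(probs)        # one score per sample
entropy_scores = Entropy().get_uncertainties(probs)

# Rank the pool, most informative samples first, e.g. to pick the next items to label.
ranked = np.argsort(-bald_scores)
print(ranked[:5])
```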
diff --git a/docs/api/index.md b/docs/api/index.md
index 3d28b609..1c5464d5 100644
--- a/docs/api/index.md
+++ b/docs/api/index.md
@@ -1,25 +1,18 @@
# API Reference
-```eval_rst
-.. toctree::
- :caption: API Definition
- :maxdepth: 1
-
- baal.modelwrapper.ModelWrapper <./modelwrapper>
- baal.bayesian <./bayesian>
- baal.active <./dataset_management>
- baal.active.heuristics <./heuristics>
- baal.calibration <./calibration>
- baal.utils <./utils>
-
-.. toctree::
- :caption: Compatibility
- :maxdepth: 1
-
- baal.utils.pytorch_lightning <./compatibility/pytorch-lightning>
- baal.transformers_trainer_wrapper <./compatibility/huggingface>
-
-```
+### :material-file-tree: API Definition
+
+* [baal.modelwrapper.ModelWrapper](./modelwrapper.md)
+* [baal.bayesian](./bayesian.md)
+* [baal.active](./dataset_management.md)
+* [baal.active.heuristics](./heuristics.md)
+* [baal.calibration](./calibration.md)
+* [baal.utils](./utils.md)
+
+### :material-file-tree: Compatibility
+
+* [baal.utils.pytorch_lightning](./compatibility/pytorch-lightning.md)
+* [baal.transformers_trainer_wrapper](./compatibility/huggingface.md)
diff --git a/docs/api/modelwrapper.md b/docs/api/modelwrapper.md
index e8ef4eeb..72811d32 100644
--- a/docs/api/modelwrapper.md
+++ b/docs/api/modelwrapper.md
@@ -32,7 +32,6 @@ predictions.shape
### API
-```eval_rst
-.. autoclass:: baal.ModelWrapper
- :members:
-```
\ No newline at end of file
+### baal.ModelWrapper
+
+::: baal.ModelWrapper
\ No newline at end of file
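
For context on the page above, here is a minimal sketch assembled from the cheat sheet calls and the log lines visible in the notebook diff below; argument names such as `epoch` and `use_cuda` come from those sources, and the toy model and data are illustrative, so treat this as a sketch rather than the exact signature.

```python
import torch
from torch import nn
from torch.utils.data import TensorDataset
from baal.modelwrapper import ModelWrapper
from baal.bayesian.dropout import patch_module

# Toy data and model; patch_module keeps Dropout active at prediction time.
dataset = TensorDataset(torch.randn(64, 8), torch.randint(0, 2, (64,)))
model = patch_module(nn.Sequential(nn.Linear(8, 16), nn.Dropout(0.5), nn.Linear(16, 2)))

wrapper = ModelWrapper(model, nn.CrossEntropyLoss())
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

wrapper.train_on_dataset(dataset, optimizer, batch_size=16, epoch=2, use_cuda=False)

# Monte Carlo predictions: `iterations` stochastic forward passes per sample.
predictions = wrapper.predict_on_dataset(dataset, batch_size=16, iterations=20, use_cuda=False)
print(predictions.shape)  # expected: (64, 2, 20)
```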
diff --git a/docs/api/utils.md b/docs/api/utils.md
index 36643632..06073998 100644
--- a/docs/api/utils.md
+++ b/docs/api/utils.md
@@ -5,7 +5,6 @@
To work with `baal.modelwrapper.ModelWrapper`, we provide `Metrics`.
-```eval_rst
-.. automodule:: baal.utils.metrics
- :members:
-```
\ No newline at end of file
+### baal.utils.metrics
+
+::: baal.utils.metrics
\ No newline at end of file
diff --git a/docs/index.md b/docs/index.md
index 12cbc38e..5bcdac01 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -13,50 +13,6 @@ If you have any question, we are reachable on [Slack](https://join.slack.com/t/b
For support, we have several ways to help you:
-* Our [FAQ](support/faq.md)
-* Submit an issue on Github [here](https://github.com/ElementAI/baal/issues/new/choose)
-* Join our [Slack](https://join.slack.com/t/baal-world/shared_invite/zt-z0izhn4y-Jt6Zu5dZaV2rsAS9sdISfg)!
-
-## :material-file-tree: Learn more about Baal
-
-* [:material-link: User Guide](user_guide)
-* [:material-book-education: Active learning dataset and training loop classes](notebooks/fundamentals/active-learning.ipynb)
-* [:material-book-education: Methods for approximating bayesian posteriors](notebooks/fundamentals/posteriors.ipynb)
-* [:material-link: API Index](api)
-* [:material-help: FAQ](support/faq.md)
-
-## :material-file-tree: Industry
-* [:material-book-education: Active learning dataset and training loop classes](notebooks/fundamentals/active-learning.ipynb)
-
-.. toctree ::
- :caption: Tutorials
- :maxdepth: 1
-
- How to use BaaL with Label Studio
- How to do research and plot progress
- How to use in production
- How to use deep ensembles
-
-.. toctree ::
- :caption: Compatibility with other libraries
- :maxdepth: 1
-
- How to use with Pytorch Lightning
- How to use with HuggingFace
- How to use with Scikit-Learn
-
-.. toctree ::
- :caption: Technical Reports
- :maxdepth: 1
-
- Combining calibration and variational inference for active learning
- Double descend in active learning
- Can active learning mitigate bias in datasets
-
-.. toctree::
- :caption: Literature and support
- :maxdepth: 2
-
- Background literature
- Cheat Sheet
-```
+* Our [:material-help: FAQ](support/faq.md)
+* Submit an issue on Github [here](https://github.com/baal-org/baal/issues/new/choose)
+* Join our [:material-slack: Slack](https://join.slack.com/t/baal-world/shared_invite/zt-z0izhn4y-Jt6Zu5dZaV2rsAS9sdISfg)!
diff --git a/docs/industry/index.md b/docs/industry/index.md
deleted file mode 100644
index e69de29b..00000000
diff --git a/docs/javascripts/mathjax.js b/docs/javascripts/mathjax.js
new file mode 100644
index 00000000..06dbf38b
--- /dev/null
+++ b/docs/javascripts/mathjax.js
@@ -0,0 +1,16 @@
+window.MathJax = {
+ tex: {
+ inlineMath: [["\\(", "\\)"]],
+ displayMath: [["\\[", "\\]"]],
+ processEscapes: true,
+ processEnvironments: true
+ },
+ options: {
+ ignoreHtmlClass: ".*|",
+ processHtmlClass: "arithmatex"
+ }
+};
+
+document$.subscribe(() => {
+ MathJax.typesetPromise()
+})
diff --git a/docs/blog/dirichlet_calibration.md b/docs/research/dirichlet_calibration.md
similarity index 60%
rename from docs/blog/dirichlet_calibration.md
rename to docs/research/dirichlet_calibration.md
index 368c7509..0cd46019 100644
--- a/docs/blog/dirichlet_calibration.md
+++ b/docs/research/dirichlet_calibration.md
@@ -4,15 +4,12 @@ A [paper recently published at NeurIPS 2019](https://dirichletcal.github.io/) pr
To achieve that, they add a new linear layer at the end of the network and train it individually on a held-out set.
-Here is a figure from the authors' NeurIPS 2019 presentation. You can find the full presention on the website above.
-
-```eval_rst
-.. figure:: images/dirichlet_calib.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-```
+Here is a figure from the authors' NeurIPS 2019 presentation. You can find the full presentation on the website above.
+
+
+![](./images/dirichlet_calib.png){ width="500" }
+ Dirichlet Calibration NeurIPS 2019
+
Our hypothesis is as follows: by modelling the uncertainty on an held-out set, we want to create a better estimation of the overall uncertainty.
@@ -24,15 +21,15 @@ Current SotA methods for active learning rely on VI to estimate the model uncert
## Methodology
-Our methodology follows a standard active learning pipeline, but we add a new training set :math:`D_{L}` which is used to train the calibration layer. After training the model on the training set :math:`D_{train}` to convergence, we train it on this held-out set and train the newly added layer.
+Our methodology follows a standard active learning pipeline, but we add a new training set $D_{L}$ which is used to train the calibration layer. After training the model on the training set $D_{train}$ to convergence, we train the newly added layer on this held-out set.
-We call the augmented model :math:`M_{calib}`. We perform the sample selection using one of the following techniques:
+We call the augmented model $M_{calib}$. We perform the sample selection using one of the following techniques:
-* Entropy: :math:`\sum_c p_i \log(p_i)`
-* BALD using MC-Dropout: :math:`H[y \mid x, D_{L}] - E_{p(w \mid D_L)}(H[y \mid x, w])`
+* Entropy: $\sum_c p_i \log(p_i)$
+* BALD using MC-Dropout: $H[y \mid x, D_{L}] - E_{p(w \mid D_L)}(H[y \mid x, w])$
* Uniform random selection
-Because we want to analyze the actual gain of using calibration, we compare the effect of using :math:`M` versus :math:`M_{calib}` across all techniques.
+Because we want to analyze the actual gain of using calibration, we compare the effect of using $M$ versus $M_{calib}$ across all techniques.
## Experiments
@@ -42,70 +39,51 @@ We test our hypothesis on CIFAR10 using a VGG-16. We initially label 1000 sample
We first want to ensure that calibration works properly. In Fig. 2, we show that throughout the active learning procedure, the calibrated loss is better than the non-calibrated loss.
-```eval_rst
-.. figure:: images/CBALDvsBALD.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Comparison between the calibrated loss and the uncalibrated loss.
-```
+
+![](./images/CBALDvsBALD.png){ width="500" align="center"}
+ Comparison between the calibrated loss and the uncalibrated loss.
+
+
Furthermore, we compute the ECE between both cases.
-```eval_rst
-.. figure:: images/CBALDvsBALDECE.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Comparison between ECE for both Calibrated BALD and BALD.
-```
+
+![](./images/CBALDvsBALDECE.png){ width="500" align="center"}
+ Comparison between ECE for both Calibrated BALD and BALD.
+
### Impact of calibration on active learning
For each method, we present the calibrated NLL at each active learning step.
-We want to compare the selection process between :math:`M` and :math:`M_{calib}`.
+We want to compare the selection process between $M$ and $M_{calib}$.
Our reasoning is as follow. We want to see if the calibrated model would pick better items over the normal one.
+To do so, we run two experiments: one where we use $M$ to select the new samples and one where we use $M_{calib}$.
+To do so we make two experiments, one where we use $M$ to select the new samples and the other uses $M_{calib}$.
In both cases, we will get a calibrated model to compare the calibrated loss.
-```eval_rst
-.. figure:: images/BALDvsCBALD_active.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Comparison between a calibrated selector and an uncalibrated one using BALD.
-
-
-.. figure:: images/EntvsCEnt_active.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Comparison between a calibrated selector and an uncalibrated one using Entropy.
+
+![](images/BALDvsCBALD_active.png){ width="500"}
+ Comparison between a calibrated selector and an uncalibrated one using BALD.
+
+
+
+![](images/EntvsCEnt_active.png){ width="500"}
+ Comparison between a calibrated selector and an uncalibrated one using Entropy.
+
+
+
+![](images/ALL_active.png){ width="500" }
+ Comparison between calibrated selectors.
+
+
In addition, we show that BALD is still better in all cases.
-.. figure:: images/ALL_active.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Comparison between calibrated selectors.
-```
## Discussion
-While we have not seen improvments by using calibration on an active learning benchmark, we still find this report useful. Active learning is but a part of the Human-ai-interaction (HAII) process. By adding an easy to use calibration method, we can further the collaboration between the human and our model.
+While we have not seen improvements from using calibration on an active learning benchmark, we still find this report useful. Active learning is but one part of the human-AI interaction (HAII) process. By adding an easy-to-use calibration method, we can further the collaboration between the human and our model.
By giving more nuanced predictions, the model is deemed more trustable by the human annotator.
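
For readers who prefer code to notation, the two selection criteria above translate to a few lines of numpy; this is a sketch of the formulas (entropy up to sign, BALD as a difference of entropies) computed on predictions of shape `[n_samples, n_classes, n_iterations]`, not of the exact implementation used in these experiments.

```python
import numpy as np

def entropy_score(probs):
    """Entropy of the averaged prediction: -sum_c p_c log p_c."""
    p = probs.mean(axis=-1)                       # average over MC iterations
    return -(p * np.log(p + 1e-12)).sum(axis=1)

def bald_score(probs):
    """BALD: H[y | x, D_L] - E_{p(w | D_L)} H[y | x, w]."""
    expected_entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1).mean(axis=-1)
    return entropy_score(probs) - expected_entropy

# 100 samples, 10 classes, 20 stochastic iterations of MC-Dropout.
probs = np.random.dirichlet(np.ones(10), size=(100, 20)).transpose(0, 2, 1)
print(bald_score(probs).shape)  # one score per sample
```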
diff --git a/docs/blog/double_descend.md b/docs/research/double_descent.md
similarity index 81%
rename from docs/blog/double_descend.md
rename to docs/research/double_descent.md
index 0fc9548e..9cd81963 100644
--- a/docs/blog/double_descend.md
+++ b/docs/research/double_descent.md
@@ -40,47 +40,26 @@ We ran 4 categories of experiments:
Dataset: CIFAR10
Model: Vgg16 trained on imagenet
-```eval_rst
-.. figure:: images/doubledescend_03.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Using early stopping and reset the weights of the linear layers after each active learning step.
-```
-
-
-```eval_rst
-.. figure:: images/doubledescend_04.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Using early stopping and reset all the weights after each active learning step.
-```
-
-
-```eval_rst
-.. figure:: images/doubledescend_02.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Overfitting the training set and reset the weights of the linear layers after each active learning step.
-```
-
-```eval_rst
-.. figure:: images/doubledescend_01.png
- :width: 400px
- :height: 200px
- :alt: alternate text
- :align: center
-
- Overfitting the training set and reset all the weights after each active learning step.
-```
+
+![](images/doubledescend_03.png){ width="500" }
+ Using early stopping and resetting the weights of the linear layers after each active learning step.
+
+
+
+![](images/doubledescend_04.png){ width="500" }
+ Using early stopping and resetting all the weights after each active learning step.
+
+
+
+![](images/doubledescend_02.png){ width="500" }
+ Overfitting the training set and resetting the weights of the linear layers after each active learning step.
+
+
+
+![](images/doubledescend_01.png){ width="500" }
+ Overfitting the training set and resetting all the weights after each active learning step.
+
+
In the first two experiments, if we are using early stopping, the partial reset will provoke a double descent. A closer
look in the second diagram shows that although in the case of fully resetting the model weights, we can prevent the
@@ -95,12 +74,11 @@ with a negligible peak. Moreover, letting the model train well before performing
key to encourage smooth training, we show the difference between letting the model to train for 10 epochs vs 5 epochs
before adding samples to the labelled set.
-```eval_rst
-NOTE: In the case of not using early stopping, `p` is used to show the number of epochs we train the model before
-estimating uncertainties and increase the labelled set.
-All in all, not using early stopping and fully resetting the model weights i.e. the last graph, could certify a smooth
-training procedure without being worried about other elements such as weight decay.
-```
+!!! note
+    In the case of not using early stopping, `p` is used to show the number of epochs we train the model before
+    estimating uncertainties and increasing the labelled set.
+    All in all, not using early stopping and fully resetting the model weights (i.e. the last graph) could ensure a smooth
+    training procedure without having to worry about other elements such as weight decay.
### Our Hypothesis
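
The two reset strategies compared in the figure captions above can be sketched as follows; the toy network stands in for the VGG-16 used in the experiments, and the surrounding training and acquisition loop is omitted.

```python
import copy
from torch import nn

# Toy conv net standing in for the VGG-16 used in this report (32x32 RGB inputs).
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Flatten(), nn.Linear(8 * 30 * 30, 10))
initial_weights = copy.deepcopy(model.state_dict())

def reset_all_weights(model):
    # "Reset all the weights": reload the initial checkpoint before the next active learning step.
    model.load_state_dict(initial_weights)

def reset_linear_layers(model):
    # "Reset the weights of the linear layers": re-initialize only the nn.Linear modules.
    for module in model.modules():
        if isinstance(module, nn.Linear):
            module.reset_parameters()
```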
diff --git a/docs/blog/images/ALL_active.png b/docs/research/images/ALL_active.png
similarity index 100%
rename from docs/blog/images/ALL_active.png
rename to docs/research/images/ALL_active.png
diff --git a/docs/blog/images/BALDvsCBALD_active.png b/docs/research/images/BALDvsCBALD_active.png
similarity index 100%
rename from docs/blog/images/BALDvsCBALD_active.png
rename to docs/research/images/BALDvsCBALD_active.png
diff --git a/docs/blog/images/CBALDvsBALD.png b/docs/research/images/CBALDvsBALD.png
similarity index 100%
rename from docs/blog/images/CBALDvsBALD.png
rename to docs/research/images/CBALDvsBALD.png
diff --git a/docs/blog/images/CBALDvsBALDECE.png b/docs/research/images/CBALDvsBALDECE.png
similarity index 100%
rename from docs/blog/images/CBALDvsBALDECE.png
rename to docs/research/images/CBALDvsBALDECE.png
diff --git a/docs/blog/images/EntvsCEnt_active.png b/docs/research/images/EntvsCEnt_active.png
similarity index 100%
rename from docs/blog/images/EntvsCEnt_active.png
rename to docs/research/images/EntvsCEnt_active.png
diff --git a/docs/blog/images/dirichlet_calib.png b/docs/research/images/dirichlet_calib.png
similarity index 100%
rename from docs/blog/images/dirichlet_calib.png
rename to docs/research/images/dirichlet_calib.png
diff --git a/docs/blog/images/doubledescend_01.png b/docs/research/images/doubledescend_01.png
similarity index 100%
rename from docs/blog/images/doubledescend_01.png
rename to docs/research/images/doubledescend_01.png
diff --git a/docs/blog/images/doubledescend_02.png b/docs/research/images/doubledescend_02.png
similarity index 100%
rename from docs/blog/images/doubledescend_02.png
rename to docs/research/images/doubledescend_02.png
diff --git a/docs/blog/images/doubledescend_03.png b/docs/research/images/doubledescend_03.png
similarity index 100%
rename from docs/blog/images/doubledescend_03.png
rename to docs/research/images/doubledescend_03.png
diff --git a/docs/blog/images/doubledescend_04.png b/docs/research/images/doubledescend_04.png
similarity index 100%
rename from docs/blog/images/doubledescend_04.png
rename to docs/research/images/doubledescend_04.png
diff --git a/docs/research/literature/Additional papers/duq.md b/docs/research/literature/Additional papers/duq.md
index e7219528..206e118d 100644
--- a/docs/research/literature/Additional papers/duq.md
+++ b/docs/research/literature/Additional papers/duq.md
@@ -17,22 +17,22 @@ DUQ uses a RBF Network to compute centroids for each class. The model is trained
For a model f, a centroid matrix W and a centroid e, we compute the similarity using a RBF kernel. Theta is a hyper parameter.
-``$`K_c(f_\theta, e_c) = exp(-\frac{\frac{1}{n}\mid \mid W_cf_\theta(x) - e_c\mid\mid^2_2}{2\sigma^2})`$``
+$K_c(f_\theta, e_c) = exp(-\frac{\frac{1}{n}\mid \mid W_cf_\theta(x) - e_c\mid\mid^2_2}{2\sigma^2})$
with this similarity we can make a prediction by selecting the centroid with the highest similarity.
The loss function is now simply
-``$`L(x,y) = - \sum_c y_clog(K_c) + (1 - y_c)log(1-K_c)`$``,
+$L(x,y) = - \sum_c y_c \log(K_c) + (1 - y_c) \log(1-K_c)$,
-where ``$`K_c(f_\theta, e_c)=K_c`$``
+where $K_c(f_\theta, e_c)=K_c$
After each batch, we update the centroid matrix using an exponential moving average.
### Regularization
-To avoid feature collapse, the authors introduce a gradient penalty directly applied to ``$`K_c`$``:
-``$`\lambda* (\mid\mid \nabla_x \sum_c K_c\mid\mid^2_2 - 1)^2`$``
-where 1 is the Lipschitz constant. In their experiments, they use ``$`\lambda=0.05`$``.
+To avoid feature collapse, the authors introduce a gradient penalty directly applied to $K_c$:
+$\lambda* (\mid\mid \nabla_x \sum_c K_c\mid\mid^2_2 - 1)^2$
+where 1 is the Lipschitz constant. In their experiments, they use $\lambda=0.05$.
In summary, this simple technique is faster and better than ensembles. It also shows that RBF networks work on large datasets.
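
A hedged sketch of the kernel and loss summarized above, assuming `n` denotes the embedding dimension and writing `sigma` for the RBF length-scale hyperparameter; the tensor shapes and values are illustrative only.

```python
import torch
import torch.nn.functional as F

def duq_kernel(features, centroids, sigma=0.1):
    """K_c = exp(-((1/n) * ||W_c f(x) - e_c||^2) / (2 * sigma^2)).

    features:  [batch, n_classes, n]  (the embedding already multiplied by W_c)
    centroids: [n_classes, n]
    """
    n = features.shape[-1]
    sq_dist = ((features - centroids) ** 2).sum(-1) / n
    return torch.exp(-sq_dist / (2 * sigma ** 2))        # [batch, n_classes]

def duq_loss(kernel, one_hot_targets):
    """L(x, y) = -sum_c [ y_c log K_c + (1 - y_c) log(1 - K_c) ]."""
    return -(one_hot_targets * torch.log(kernel + 1e-12)
             + (1 - one_hot_targets) * torch.log(1 - kernel + 1e-12)).sum(-1).mean()

K = duq_kernel(torch.randn(4, 10, 32), torch.randn(10, 32))
loss = duq_loss(K, F.one_hot(torch.randint(0, 10, (4,)), 10).float())
```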
diff --git a/docs/research/literature/Additional papers/lightcoresets.md b/docs/research/literature/Additional papers/lightcoresets.md
index a1108ef0..7d9b4d32 100644
--- a/docs/research/literature/Additional papers/lightcoresets.md
+++ b/docs/research/literature/Additional papers/lightcoresets.md
@@ -6,16 +6,16 @@
This paper presents a novel Coreset algorithm called *Light Coreset*.
-Let ``$`X`$`` be the dataset, ``$`d`$`` a distance function and ``$`\mu(X)`$`` the mean of the dataset per feature.
+Let $X$ be the dataset, $d$ a distance function and $\mu(X)$ the mean of the dataset per feature.
-We compute the distribution ``$`q`$``with:
+We compute the distribution $q$ with:
-``$`q(x) = 0.5 * \frac{1}{\vert X \vert} + 0.5 * \frac{d(x, \mu(X))^2}{\sum_{x' \in X} d(x', \mu(X))^2}`$``,
-where ``$`x \in X`$``.
+$q(x) = 0.5 * \frac{1}{\vert X \vert} + 0.5 * \frac{d(x, \mu(X))^2}{\sum_{x' \in X} d(x', \mu(X))^2}$,
+where $x \in X$.
-We can then select ``$`m`$`` samples by sampling from this distribution. For their experiments, they used the L2 distance for *d*.
+We can then select $m$ samples by sampling from this distribution. For their experiments, they used the L2 distance for *d*.
-Let A be the first part of the equation ``$`q`$`` and B the second. The authors offers the following explanation :
+Let A be the first part of the equation $q$ and B the second. The authors offer the following explanation:
>The first component (A) is the uniform distribution and ensures
that all points are sampled with nonzero probability. The second
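
The sampling distribution above fits in a few lines of numpy; the function name is illustrative, `d` is the L2 distance as in the summary, and the importance weights used by the full coreset construction are not covered here.

```python
import numpy as np

def light_coreset_distribution(X):
    """q(x) = 0.5 * 1/|X| + 0.5 * d(x, mu(X))^2 / sum_x' d(x', mu(X))^2, with d the L2 distance."""
    mu = X.mean(axis=0)                        # mean of the dataset per feature
    sq_dist = ((X - mu) ** 2).sum(axis=1)      # d(x, mu(X))^2
    return 0.5 / len(X) + 0.5 * sq_dist / sq_dist.sum()

X = np.random.randn(10_000, 5)
q = light_coreset_distribution(X)              # sums to 1 by construction
coreset_idx = np.random.default_rng(0).choice(len(X), size=200, p=q)   # select m = 200 samples
```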
diff --git a/docs/research/literature/Additional papers/sparse_selection.md b/docs/research/literature/Additional papers/sparse_selection.md
index 0e029c58..91244022 100644
--- a/docs/research/literature/Additional papers/sparse_selection.md
+++ b/docs/research/literature/Additional papers/sparse_selection.md
@@ -11,16 +11,16 @@ Published at NeurIPS 2019
A known issue of BALD, when used in Batch Active Learning is that it selects highly correlated samples.
By combining BNNs with a novel coreset algorithm, the authors propose a way to estimate the true posterior data distribution.
-In brief, they want to select a batch ``$`D'`$`` such that the posterior distribution best approximate the complete data posterior.
+In brief, they want to select a batch $D'$ such that the posterior distribution best approximates the complete data posterior.
Because we do not know the complete posterior, the authors approximate it using the predictive distribution. The idea is summarized in Eq. 4.
![](../images/sparse_selection/eq4.png)
-This measure can be optimized using Frank-Wolfe which uses the dot-product ``$`\langle L_m, L_n\rangle`$`` to estimate the affectations.
+This measure can be optimized using Frank-Wolfe which uses the dot-product $\langle L_m, L_n\rangle$ to estimate the affectations.
-While a closed-form procedure exists to compute this dot-product, it is expensive to run (``$`O(||P||^2)`$``).
-The authors suggest the use of random projections drawn from the parameters distribution ``$`\hat\pi`$``.
-This approximation makes the algorithm ``$`O(||P||J)`$``, where J is the number of samples drawn from ``$`\hat\pi`$``.
+While a closed-form procedure exists to compute this dot-product, it is expensive to run ($O(||P||^2)$).
+The authors suggest the use of random projections drawn from the parameters distribution $\hat\pi$.
+This approximation makes the algorithm $O(||P||J)$, where J is the number of samples drawn from $\hat\pi$.
diff --git a/docs/research/literature/index.md b/docs/research/literature/index.md
index 325468b8..e68741fc 100644
--- a/docs/research/literature/index.md
+++ b/docs/research/literature/index.md
@@ -4,8 +4,8 @@ This page is here to collect summaries of papers that focus on active learning.
The idea is to share knowledge on recent developments in active learning.
If you've read a paper recently, write a little summary in markdown, put it in
-the folder `docs/literature` and make a pull request. You can even do all of
-that right in the github web UI!
+the folder `docs/research/literature` and make a pull request. You can even do all of
+that right in the GitHub web UI!
```eval_rst
.. toctree::
diff --git a/docs/support/faq.md b/docs/support/faq.md
index 58d99618..5f72acbd 100644
--- a/docs/support/faq.md
+++ b/docs/support/faq.md
@@ -103,11 +103,11 @@ al_dataset.label_randomly(10)
pool = al_dataset.pool
```
-From a rigorous point of view: ``$`D = ds `$`` , ``$`D_L=al\_dataset `$`` and ``$`D_U = D \setminus D_L = pool `$``.
-Then, we train our model on ``$`D_L `$`` and compute the uncertainty on ``$`D_U `$``. The most uncertains samples are
-labelled and added to ``$`D_L `$``, removed from ``$`D_U `$``.
+From a rigorous point of view: $D = ds$, $D_L = al\_dataset$ and $D_U = D \setminus D_L = pool$.
+Then, we train our model on $D_L$ and compute the uncertainty on $D_U$. The most uncertain samples are
+labelled, added to $D_L$ and removed from $D_U$.
-Let a method `query_human` performs the annotations, we can label our dataset using indices relative to``$`D_U `$``.
+Let a method `query_human` perform the annotations; we can then label our dataset using indices relative to $D_U$.
This assumes that your dataset class `YourDataset` has a method named `label` which has the following
definition: `def label(self, idx, value)` where we give the label for index `idx`. There the index is not relative to
the pool, so you don't have to worry about it.
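
A minimal sketch of the `label` contract described in this answer; the class and attribute names are illustrative, and only the part relevant to index handling is shown.

```python
from torch.utils.data import Dataset

class YourDataset(Dataset):
    """Toy dataset exposing the `label(idx, value)` method described above."""
    def __init__(self, items):
        self.items = items
        self.targets = [None] * len(items)

    def __len__(self):
        return len(self.items)

    def __getitem__(self, idx):
        return self.items[idx], self.targets[idx]

    def label(self, idx, value):
        # `idx` is relative to the full dataset D, not to the pool D_U:
        # the index translation is handled for you, as noted above.
        self.targets[idx] = value
```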
diff --git a/docs/support/index.md b/docs/support/index.md
index 852c875b..20d656be 100644
--- a/docs/support/index.md
+++ b/docs/support/index.md
@@ -1,2 +1,9 @@
# Support
+For support, we have several ways to help you:
+
+* Our [:material-help: FAQ](faq.md)
+* Submit an issue on Github [here](https://github.com/baal-org/baal/issues/new/choose)
+* Join our [:material-slack: Slack](https://join.slack.com/t/baal-world/shared_invite/zt-z0izhn4y-Jt6Zu5dZaV2rsAS9sdISfg)!
+ * General questions can be asked under the #questions channel
+
\ No newline at end of file
diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md
new file mode 100644
index 00000000..82e70532
--- /dev/null
+++ b/docs/tutorials/index.md
@@ -0,0 +1,16 @@
+# Tutorials
+
+Tutorials are split into two sections, "How-to" and "Compatibility". The first focuses on Baal's capabilities and the
+latter on how we integrate with other common frameworks such as [Label Studio](label-studio.md), [HuggingFace](../notebooks/compatibility/nlp_classification.ipynb) or [Lightning Flash](https://devblog.pytorchlightning.ai/active-learning-made-simple-using-flash-and-baal-2216df6f872c).
+
+## :material-file-tree: How to
+
+* Run an active learning experiment
+* Active learning in production
+* Deep Ensembles
+
+## :material-file-tree: Compatibility
+
+* [:material-link: Lightning Flash](https://devblog.pytorchlightning.ai/active-learning-made-simple-using-flash-and-baal-2216df6f872c)
+* [HuggingFace](../notebooks/compatibility/nlp_classification.ipynb)
+* [Scikit-Learn](../notebooks/compatibility/sklearn_tutorial.ipynb)
\ No newline at end of file
diff --git a/docs/industry/label-studio.md b/docs/tutorials/label-studio.md
similarity index 100%
rename from docs/industry/label-studio.md
rename to docs/tutorials/label-studio.md
diff --git a/docs/user_guide/baal_cheatsheet.md b/docs/user_guide/baal_cheatsheet.md
index 57975ca6..c7200fba 100644
--- a/docs/user_guide/baal_cheatsheet.md
+++ b/docs/user_guide/baal_cheatsheet.md
@@ -7,9 +7,9 @@ In the table below, we have a mapping between common equations and the BaaL API.
Here are the types for all variables needed.
```python
-model : torch.nn.Module
-wrapper : baal.ModelWrapper
-dataset: torch.utils.data_utils.Dataset
+model: torch.nn.Module
+wrapper: baal.ModelWrapper
+dataset: torch.utils.data_utils.Dataset
bald = baal.active.heuristics.BALD()
entropy = baal.active.heuristics.Entropy()
```
@@ -18,17 +18,12 @@ We assume that `baal.bayesian.dropout.patch_module` has been applied to the mode
`model = baal.bayesian.dropout.patch_module(model)`
-```eval_rst
-.. csv-table:: BaaL cheat sheet
- :header: "Description", "Equation", "BaaL"
- :widths: 20, 20, 40
-
- "Bayesian Model Averaging", ":math:`\hat{T} = p(y \mid x, {\cal D})= \int p(y \mid x, \theta)p(\theta \mid D) d\theta`", "`wrapper.predict_on_dataset(dataset, batch_size=B, iterations=I, use_cuda=True).mean(-1)`"
- "MC-Dropout", ":math:`T = \{p(y\mid x_j, \theta_i)\} \mid x_j \in {\cal D}' ,i \in \{1, \ldots, I\}`", "`wrapper.predict_on_dataset(dataset, batch_size=B, iterations=I, use_cuda=True)`"
- "BALD", ":math:`{\cal I}[y, \theta \mid x, {\cal D}] = {\cal H}[y \mid x, {\cal D}] - {\cal E}_{p(\theta \mid {\cal D})}[{\cal H}[y \mid x, \theta]]`", "`bald.get_uncertainties(T)`"
- "Entropy", ":math:`\sum_c \hat{T}_c \log(\hat{T}_c)`", "`entropy.get_uncertainties(T)`"
-
-```
+| Description | Equation | BaaL |
+|--------------------------|----------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------|
+| Bayesian Model Averaging | $\hat{T} = p(y \mid x, {\cal D})= \int p(y \mid x, \theta)p(\theta \mid D) d\theta$ | `wrapper.predict_on_dataset(dataset, batch_size=B, iterations=I, use_cuda=True).mean(-1)` |
+| MC-Dropout | $T = \{p(y\mid x_j, \theta_i)\} \mid x_j \in {\cal D}' ,i \in \{1, \ldots, I\}$ | `wrapper.predict_on_dataset(dataset, batch_size=B, iterations=I, use_cuda=True)` |
+| BALD | ${\cal I}[y, \theta \mid x, {\cal D}] = {\cal H}[y \mid x, {\cal D}] - {\cal E}_{p(\theta \mid {\cal D})}[{\cal H}[y \mid x, \theta]]$ | `bald.get_uncertainties(T)` |
+| Entropy | $\sum_c \hat{T}_c \log(\hat{T}_c)$ | `entropy.get_uncertainties(T)` |
**Contributing**
diff --git a/docs/user_guide/index.md b/docs/user_guide/index.md
index 4fc2a264..bc27f9e7 100644
--- a/docs/user_guide/index.md
+++ b/docs/user_guide/index.md
@@ -6,9 +6,9 @@ In addition, we propose a [cheat sheet](./baal_cheatsheet.md) that will help use
### Notations and glossary
-* Training dataset ``$`D_L`$``
-* Pool, the unlabelled portion of the dataset ``$`D_U`$``
-* Heuristic, the function that computes the uncertainty (ex. BALD) ``$`U `$``
+* Training dataset $D_L$
+* Pool, the unlabelled portion of the dataset $D_U$
+* Heuristic, the function that computes the uncertainty (ex. BALD) $U$
* Active learning step, the sequence of training, selecting and labelling one or many examples.
* BALD, an heuristic that works well with deep learning models that are overconfident.
* Query size, the number of items to label between retraining.
@@ -60,6 +60,7 @@ We hope that work in this area continues so that we can better understand the im
**References**
+
* Kirsch, Andreas, Joost Van Amersfoort, and Yarin Gal. "Batchbald: Efficient and diverse batch acquisition for deep bayesian active learning." NeurIPS (2019).
* Jain, Siddhartha, Ge Liu, and David Gifford. "Information Condensing Active Learning." arXiv preprint arXiv:2002.07916 (2020).
* Houlsby, Neil, et al. "Bayesian active learning for classification and preference learning." arXiv preprint arXiv:1112.5745 (2011).
diff --git a/mkdocs.yml b/mkdocs.yml
index 6b6a4c26..8073c11d 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -21,30 +21,82 @@ theme:
name: Switch to light mode
features:
- navigation.tabs
+ - navigation.tabs.sticky
- navigation.indexes
- navigation.instant
icon:
repo: fontawesome/brands/github
plugins:
- search
+ - mkdocs-jupyter
+ - mkdocstrings
markdown_extensions:
+ - md_in_html
- attr_list
+ - pymdownx.arithmatex:
+ generic: true
- pymdownx.emoji:
emoji_index: !!python/name:materialx.emoji.twemoji
emoji_generator: !!python/name:materialx.emoji.to_svg
+ - pymdownx.highlight:
+ anchor_linenums: true
+ - pymdownx.inlinehilite
+ - pymdownx.snippets
+ - pymdownx.superfences
+
+extra_javascript:
+ - javascripts/mathjax.js
+ - https://polyfill.io/v3/polyfill.min.js?features=es6
+ - https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js
nav:
- Home: index.md
- User Guide:
- user_guide/index.md
+ - Cheat Sheet: user_guide/baal_cheatsheet.md
+ - Active data structure: notebooks/fundamentals/active-learning.ipynb
+ - Computing uncertainty:
+          - Stochastic models: notebooks/fundamentals/posteriors.ipynb
+ - Heuristics: user_guide/heuristics.md
- API:
- api/index.md
- - Production:
- - industry/index.md
+ - api/bayesian.md
+ - api/calibration.md
+ - api/dataset_management.md
+ - api/heuristics.md
+ - api/modelwrapper.md
+ - api/utils.md
+ - Compatibility:
+ - api/compatibility/huggingface.md
+ - api/compatibility/pytorch-lightning.md
+
+ - Tutorials:
+ - tutorials/index.md
+ - Compatibility:
+ - tutorials/label-studio.md
+ - notebooks/compatibility/nlp_classification.ipynb
+ - notebooks/compatibility/sklearn_tutorial.ipynb
+ - notebooks/baal_prod_cls.ipynb
+ - notebooks/deep_ensemble.ipynb
- Research:
- research/index.md
- - Blog:
- - blog/index.md
+ - Technical Reports:
+ - notebooks/fairness/ActiveFairness.ipynb
+ - research/dirichlet_calibration.md
+ - research/double_descent.md
+ - Literature:
+ - research/literature/index.md
+ - research/literature/core-papers.md
+ - Additional papers:
+ - research/literature/more_papers.md
+ - research/literature/Additional papers/dmi.md
+ - research/literature/Additional papers/duq.md
+ - research/literature/Additional papers/gyolov3.md
+ - research/literature/Additional papers/lightcoresets.md
+ - research/literature/Additional papers/sparse_selection.md
+ - research/literature/Additional papers/vaal.md
+
- Support:
- support/index.md
+ - support/faq.md
diff --git a/notebooks/active_learning_process.ipynb b/notebooks/active_learning_process.ipynb
index 9faf3f2e..37b91082 100644
--- a/notebooks/active_learning_process.ipynb
+++ b/notebooks/active_learning_process.ipynb
@@ -2,11 +2,15 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"# How to do research and visualize progress\n",
"\n",
- "In this tutorial, we will show how to use BaaL for research ie. when we know the labels.\n",
+    "In this tutorial, we will show how to use Baal for research, i.e. when we know the labels.\n",
"We will introduce notions such as dataset management, MC-Dropout, BALD. If you need more documentation, be sure to check our **Additional resources** section below!\n",
"\n",
"BaaL can be used on a variety of research domains:\n",
@@ -38,7 +42,11 @@
{
"cell_type": "code",
"execution_count": 1,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"# Let's start with a bunch of imports.\n",
@@ -73,7 +81,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### Dataset management and the pool\n",
"\n",
@@ -111,7 +123,11 @@
{
"cell_type": "code",
"execution_count": 2,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"\"\"\"\n",
@@ -139,7 +155,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"Here we define our Experiment configuration, this can come from your favorite experiment manager like MLFlow.\n",
"BaaL does not expect a particular format as all arguments are supplied."
@@ -148,7 +168,11 @@
{
"cell_type": "code",
"execution_count": 3,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"\n",
@@ -183,7 +207,11 @@
{
"cell_type": "code",
"execution_count": 4,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"def get_datasets(initial_pool):\n",
@@ -261,7 +289,11 @@
{
"cell_type": "code",
"execution_count": 5,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stdout",
@@ -316,7 +348,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### What is an active learning loop\n",
"\n",
@@ -329,775 +365,14 @@
},
{
"cell_type": "code",
- "execution_count": 6,
- "metadata": {},
- "outputs": [
- {
- "data": {
- "application/vnd.jupyter.widget-view+json": {
- "model_id": "db80f856c34647a1a8e129a84339a57f",
- "version_major": 2,
- "version_minor": 0
- },
- "text/plain": [
- " 0%| | 0/200 [00:00, ?it/s]"
- ]
- },
- "metadata": {},
- "output_type": "display_data"
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:26:02.888118Z [\u001b[32minfo ] Starting training dataset=512 epoch=10\n"
- ]
- },
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/opt/conda/lib/python3.9/site-packages/torch/utils/data/dataloader.py:478: UserWarning: This DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 1, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.\n",
- " warnings.warn(_create_warning_msg(\n",
- "/opt/conda/lib/python3.9/site-packages/torch/nn/functional.py:718: UserWarning: Named tensors and all their associated APIs are an experimental feature and subject to change. Please do not use them for anything important until they are released as stable. (Triggered internally at /pytorch/c10/core/TensorImpl.h:1156.)\n",
- " return torch.max_pool2d(input, kernel_size, stride, padding, dilation, ceil_mode)\n"
- ]
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:26:32.784181Z [\u001b[32minfo ] Training complete train_loss=0.42513248324394226\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:26:32.785716Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:26:42.387235Z [\u001b[32minfo ] Evaluation complete test_loss=0.5483408570289612\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:26:42.391419Z [\u001b[32minfo ] Start Predict dataset=14488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:28:48.686742Z [\u001b[32minfo ] Starting training dataset=612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:29:18.082733Z [\u001b[32minfo ] Training complete train_loss=0.023272458463907242\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:29:18.084489Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:29:26.288090Z [\u001b[32minfo ] Evaluation complete test_loss=1.0466769933700562\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:29:26.292233Z [\u001b[32minfo ] Start Predict dataset=14388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:31:33.634294Z [\u001b[32minfo ] Starting training dataset=712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:32:06.180206Z [\u001b[32minfo ] Training complete train_loss=0.020062478259205818\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:32:06.181728Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:32:14.669412Z [\u001b[32minfo ] Evaluation complete test_loss=1.234800934791565\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:32:14.673486Z [\u001b[32minfo ] Start Predict dataset=14288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:34:19.535935Z [\u001b[32minfo ] Starting training dataset=812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:34:50.487032Z [\u001b[32minfo ] Training complete train_loss=0.0782688781619072\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:34:50.488655Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:34:58.867665Z [\u001b[32minfo ] Evaluation complete test_loss=0.9648405909538269\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:34:58.871468Z [\u001b[32minfo ] Start Predict dataset=14188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:37:04.037450Z [\u001b[32minfo ] Starting training dataset=912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:37:37.071740Z [\u001b[32minfo ] Training complete train_loss=0.006023803725838661\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:37:37.073237Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:37:45.168673Z [\u001b[32minfo ] Evaluation complete test_loss=0.8979674577713013\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:37:45.173268Z [\u001b[32minfo ] Start Predict dataset=14088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:39:49.231978Z [\u001b[32minfo ] Starting training dataset=1012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:40:22.371968Z [\u001b[32minfo ] Training complete train_loss=0.015347965992987156\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:40:22.373994Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:40:30.474273Z [\u001b[32minfo ] Evaluation complete test_loss=0.8754228353500366\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:40:30.478465Z [\u001b[32minfo ] Start Predict dataset=13988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:42:36.883231Z [\u001b[32minfo ] Starting training dataset=1112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:43:12.986473Z [\u001b[32minfo ] Training complete train_loss=0.008938436396420002\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:43:12.988493Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:43:21.974697Z [\u001b[32minfo ] Evaluation complete test_loss=0.8416990041732788\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:43:21.979689Z [\u001b[32minfo ] Start Predict dataset=13888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:45:28.336859Z [\u001b[32minfo ] Starting training dataset=1212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:46:06.676436Z [\u001b[32minfo ] Training complete train_loss=0.006746976636350155\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:46:06.678271Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:46:15.267152Z [\u001b[32minfo ] Evaluation complete test_loss=0.8873146772384644\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:46:15.271275Z [\u001b[32minfo ] Start Predict dataset=13788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:48:20.279303Z [\u001b[32minfo ] Starting training dataset=1312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:48:59.773418Z [\u001b[32minfo ] Training complete train_loss=0.07147088646888733\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:48:59.775274Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:49:08.279979Z [\u001b[32minfo ] Evaluation complete test_loss=0.6853619813919067\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:49:08.284169Z [\u001b[32minfo ] Start Predict dataset=13688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:51:13.170250Z [\u001b[32minfo ] Starting training dataset=1412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:51:53.273587Z [\u001b[32minfo ] Training complete train_loss=0.04359261691570282\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:51:53.275475Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:52:01.677753Z [\u001b[32minfo ] Evaluation complete test_loss=0.6789661645889282\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:52:01.681596Z [\u001b[32minfo ] Start Predict dataset=13588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:54:03.351257Z [\u001b[32minfo ] Starting training dataset=1512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:54:44.773633Z [\u001b[32minfo ] Training complete train_loss=0.018604231998324394\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:54:44.776034Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:54:53.375403Z [\u001b[32minfo ] Evaluation complete test_loss=0.7287857532501221\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:54:53.380431Z [\u001b[32minfo ] Start Predict dataset=13488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:56:55.411753Z [\u001b[32minfo ] Starting training dataset=1612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T15:57:37.783421Z [\u001b[32minfo ] Training complete train_loss=0.03406292200088501\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T15:57:37.785087Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T15:57:46.284654Z [\u001b[32minfo ] Evaluation complete test_loss=0.638004720211029\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T15:57:46.288731Z [\u001b[32minfo ] Start Predict dataset=13388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T15:59:43.650554Z [\u001b[32minfo ] Starting training dataset=1712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:00:25.788374Z [\u001b[32minfo ] Training complete train_loss=0.06721857935190201\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:00:25.789980Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:00:34.270598Z [\u001b[32minfo ] Evaluation complete test_loss=0.6078959703445435\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:00:34.274553Z [\u001b[32minfo ] Start Predict dataset=13288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:02:29.738611Z [\u001b[32minfo ] Starting training dataset=1812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:03:13.372754Z [\u001b[32minfo ] Training complete train_loss=0.08680642396211624\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:03:13.374516Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:03:21.776862Z [\u001b[32minfo ] Evaluation complete test_loss=0.647302508354187\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:03:21.780676Z [\u001b[32minfo ] Start Predict dataset=13188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:05:17.095705Z [\u001b[32minfo ] Starting training dataset=1912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:06:01.375814Z [\u001b[32minfo ] Training complete train_loss=0.06293369829654694\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:06:01.377432Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:06:09.786188Z [\u001b[32minfo ] Evaluation complete test_loss=0.6817241311073303\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:06:09.789944Z [\u001b[32minfo ] Start Predict dataset=13088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:08:04.612386Z [\u001b[32minfo ] Starting training dataset=2012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:08:49.982937Z [\u001b[32minfo ] Training complete train_loss=0.012322206981480122\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:08:49.984675Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:08:58.488292Z [\u001b[32minfo ] Evaluation complete test_loss=0.8289020657539368\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:08:58.566936Z [\u001b[32minfo ] Start Predict dataset=12988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:10:53.208150Z [\u001b[32minfo ] Starting training dataset=2112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:11:40.481402Z [\u001b[32minfo ] Training complete train_loss=0.06632529199123383\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:11:40.483683Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:11:48.892025Z [\u001b[32minfo ] Evaluation complete test_loss=0.6253312826156616\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:11:48.896544Z [\u001b[32minfo ] Start Predict dataset=12888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:13:44.590686Z [\u001b[32minfo ] Starting training dataset=2212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:14:31.585109Z [\u001b[32minfo ] Training complete train_loss=0.044168129563331604\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:14:31.587383Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:14:40.077616Z [\u001b[32minfo ] Evaluation complete test_loss=0.6782843470573425\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:14:40.081708Z [\u001b[32minfo ] Start Predict dataset=12788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:16:33.938059Z [\u001b[32minfo ] Starting training dataset=2312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:17:21.669894Z [\u001b[32minfo ] Training complete train_loss=0.09880666434764862\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:17:21.671646Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:17:30.173757Z [\u001b[32minfo ] Evaluation complete test_loss=0.8312588930130005\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:17:30.177913Z [\u001b[32minfo ] Start Predict dataset=12688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:19:20.962003Z [\u001b[32minfo ] Starting training dataset=2412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:20:10.175347Z [\u001b[32minfo ] Training complete train_loss=0.049882616847753525\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:20:10.177008Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:20:18.676877Z [\u001b[32minfo ] Evaluation complete test_loss=0.6991893649101257\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:20:18.681326Z [\u001b[32minfo ] Start Predict dataset=12588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:22:10.999132Z [\u001b[32minfo ] Starting training dataset=2512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:23:00.569287Z [\u001b[32minfo ] Training complete train_loss=0.06398215144872665\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:23:00.571141Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:23:08.785616Z [\u001b[32minfo ] Evaluation complete test_loss=0.5477628111839294\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:23:08.789326Z [\u001b[32minfo ] Start Predict dataset=12488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:24:58.624758Z [\u001b[32minfo ] Starting training dataset=2612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:25:49.383363Z [\u001b[32minfo ] Training complete train_loss=0.046333637088537216\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:25:49.385026Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:25:57.787980Z [\u001b[32minfo ] Evaluation complete test_loss=0.702488124370575\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:25:57.792133Z [\u001b[32minfo ] Start Predict dataset=12388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:27:47.319856Z [\u001b[32minfo ] Starting training dataset=2712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:28:39.489135Z [\u001b[32minfo ] Training complete train_loss=0.08484052121639252\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:28:39.490987Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:28:47.771346Z [\u001b[32minfo ] Evaluation complete test_loss=0.5731009840965271\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:28:47.775165Z [\u001b[32minfo ] Start Predict dataset=12288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:30:36.802687Z [\u001b[32minfo ] Starting training dataset=2812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:31:29.576259Z [\u001b[32minfo ] Training complete train_loss=0.09145867079496384\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:31:29.578028Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:31:37.886296Z [\u001b[32minfo ] Evaluation complete test_loss=0.5549673438072205\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:31:37.889800Z [\u001b[32minfo ] Start Predict dataset=12188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:33:26.398167Z [\u001b[32minfo ] Starting training dataset=2912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:34:20.490228Z [\u001b[32minfo ] Training complete train_loss=0.02744719199836254\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:34:20.492170Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:34:29.067964Z [\u001b[32minfo ] Evaluation complete test_loss=0.660302996635437\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:34:29.072246Z [\u001b[32minfo ] Start Predict dataset=12088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:36:15.329656Z [\u001b[32minfo ] Starting training dataset=3012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:37:09.966144Z [\u001b[32minfo ] Training complete train_loss=0.06737153232097626\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:37:09.968091Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:37:18.489054Z [\u001b[32minfo ] Evaluation complete test_loss=0.621569812297821\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:37:18.569881Z [\u001b[32minfo ] Start Predict dataset=11988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:39:04.909076Z [\u001b[32minfo ] Starting training dataset=3112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:40:01.971488Z [\u001b[32minfo ] Training complete train_loss=0.05985158681869507\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:40:01.973623Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:40:10.988141Z [\u001b[32minfo ] Evaluation complete test_loss=0.6705137491226196\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:40:10.992964Z [\u001b[32minfo ] Start Predict dataset=11888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:42:03.615463Z [\u001b[32minfo ] Starting training dataset=3212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:43:06.187299Z [\u001b[32minfo ] Training complete train_loss=0.06435515731573105\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:43:06.189039Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:43:14.974582Z [\u001b[32minfo ] Evaluation complete test_loss=0.6966602206230164\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:43:14.978799Z [\u001b[32minfo ] Start Predict dataset=11788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:45:01.591964Z [\u001b[32minfo ] Starting training dataset=3312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:45:59.571342Z [\u001b[32minfo ] Training complete train_loss=0.05543559044599533\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:45:59.572948Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:46:07.784239Z [\u001b[32minfo ] Evaluation complete test_loss=0.6278331279754639\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:46:07.788230Z [\u001b[32minfo ] Start Predict dataset=11688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:47:52.487984Z [\u001b[32minfo ] Starting training dataset=3412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:48:52.691644Z [\u001b[32minfo ] Training complete train_loss=0.07221610099077225\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:48:52.765384Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:49:01.280861Z [\u001b[32minfo ] Evaluation complete test_loss=0.6179820895195007\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:49:01.285400Z [\u001b[32minfo ] Start Predict dataset=11588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:50:44.974180Z [\u001b[32minfo ] Starting training dataset=3512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:51:42.676914Z [\u001b[32minfo ] Training complete train_loss=0.039833199232816696\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:51:42.678530Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:51:51.080616Z [\u001b[32minfo ] Evaluation complete test_loss=0.6217177510261536\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:51:51.084222Z [\u001b[32minfo ] Start Predict dataset=11488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:53:31.595579Z [\u001b[32minfo ] Starting training dataset=3612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:54:29.772139Z [\u001b[32minfo ] Training complete train_loss=0.03375746309757233\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:54:29.774393Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:54:37.979455Z [\u001b[32minfo ] Evaluation complete test_loss=0.6929616928100586\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:54:37.982888Z [\u001b[32minfo ] Start Predict dataset=11388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:56:16.550993Z [\u001b[32minfo ] Starting training dataset=3712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T16:57:15.780976Z [\u001b[32minfo ] Training complete train_loss=0.04057781398296356\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T16:57:15.782596Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T16:57:23.974826Z [\u001b[32minfo ] Evaluation complete test_loss=0.7048872113227844\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T16:57:23.978437Z [\u001b[32minfo ] Start Predict dataset=11288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T16:59:04.231574Z [\u001b[32minfo ] Starting training dataset=3812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:00:07.275913Z [\u001b[32minfo ] Training complete train_loss=0.07070793211460114\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:00:07.277627Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:00:15.779335Z [\u001b[32minfo ] Evaluation complete test_loss=0.5328732132911682\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:00:15.782809Z [\u001b[32minfo ] Start Predict dataset=11188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:01:54.765307Z [\u001b[32minfo ] Starting training dataset=3912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:02:58.388761Z [\u001b[32minfo ] Training complete train_loss=0.12230982631444931\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:02:58.391212Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:03:06.876905Z [\u001b[32minfo ] Evaluation complete test_loss=0.5321411490440369\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:03:06.881740Z [\u001b[32minfo ] Start Predict dataset=11088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:04:44.268732Z [\u001b[32minfo ] Starting training dataset=4012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:05:51.968480Z [\u001b[32minfo ] Training complete train_loss=0.044358428567647934\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:05:51.970839Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:06:00.487090Z [\u001b[32minfo ] Evaluation complete test_loss=0.8059483170509338\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:06:00.491321Z [\u001b[32minfo ] Start Predict dataset=10988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:07:37.287176Z [\u001b[32minfo ] Starting training dataset=4112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:08:43.176013Z [\u001b[32minfo ] Training complete train_loss=0.1225663274526596\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:08:43.177778Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:08:51.476998Z [\u001b[32minfo ] Evaluation complete test_loss=0.4315877854824066\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:08:51.481084Z [\u001b[32minfo ] Start Predict dataset=10888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:10:29.940602Z [\u001b[32minfo ] Starting training dataset=4212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:11:37.180729Z [\u001b[32minfo ] Training complete train_loss=0.10635881125926971\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:11:37.182797Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:11:45.678586Z [\u001b[32minfo ] Evaluation complete test_loss=0.5154958963394165\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:11:45.682656Z [\u001b[32minfo ] Start Predict dataset=10788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:13:21.169707Z [\u001b[32minfo ] Starting training dataset=4312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:14:29.792534Z [\u001b[32minfo ] Training complete train_loss=0.053799740970134735\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:14:29.866135Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:14:38.169583Z [\u001b[32minfo ] Evaluation complete test_loss=0.6495267748832703\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:14:38.173309Z [\u001b[32minfo ] Start Predict dataset=10688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:16:11.983602Z [\u001b[32minfo ] Starting training dataset=4412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:17:20.387089Z [\u001b[32minfo ] Training complete train_loss=0.018582936376333237\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:17:20.388534Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:17:28.788662Z [\u001b[32minfo ] Evaluation complete test_loss=0.7638018131256104\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:17:28.867651Z [\u001b[32minfo ] Start Predict dataset=10588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:19:02.263938Z [\u001b[32minfo ] Starting training dataset=4512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:20:11.585626Z [\u001b[32minfo ] Training complete train_loss=0.047023482620716095\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:20:11.587573Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:20:19.881492Z [\u001b[32minfo ] Evaluation complete test_loss=0.49579280614852905\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:20:19.885845Z [\u001b[32minfo ] Start Predict dataset=10488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:21:54.652841Z [\u001b[32minfo ] Starting training dataset=4612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:23:06.479348Z [\u001b[32minfo ] Training complete train_loss=0.06473588943481445\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:23:06.480989Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:23:14.966838Z [\u001b[32minfo ] Evaluation complete test_loss=0.6052875518798828\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:23:14.970796Z [\u001b[32minfo ] Start Predict dataset=10388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:24:47.734368Z [\u001b[32minfo ] Starting training dataset=4712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:25:59.879686Z [\u001b[32minfo ] Training complete train_loss=0.04754060506820679\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:25:59.881753Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:26:08.370846Z [\u001b[32minfo ] Evaluation complete test_loss=0.6196796298027039\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:26:08.374766Z [\u001b[32minfo ] Start Predict dataset=10288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:27:38.912284Z [\u001b[32minfo ] Starting training dataset=4812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:28:56.790148Z [\u001b[32minfo ] Training complete train_loss=0.13738520443439484\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:28:56.791933Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:29:05.372354Z [\u001b[32minfo ] Evaluation complete test_loss=0.417594313621521\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:29:05.376664Z [\u001b[32minfo ] Start Predict dataset=10188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:30:37.433346Z [\u001b[32minfo ] Starting training dataset=4912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:31:52.394380Z [\u001b[32minfo ] Training complete train_loss=0.08568105101585388\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:31:52.396245Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:32:00.787110Z [\u001b[32minfo ] Evaluation complete test_loss=0.48676493763923645\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:32:00.791032Z [\u001b[32minfo ] Start Predict dataset=10088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:33:28.340993Z [\u001b[32minfo ] Starting training dataset=5012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:34:43.667946Z [\u001b[32minfo ] Training complete train_loss=0.036210086196660995\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:34:43.670010Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:34:52.185600Z [\u001b[32minfo ] Evaluation complete test_loss=0.6417835354804993\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:34:52.189976Z [\u001b[32minfo ] Start Predict dataset=9988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:36:20.994496Z [\u001b[32minfo ] Starting training dataset=5112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:37:36.585937Z [\u001b[32minfo ] Training complete train_loss=0.055751074105501175\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:37:36.587822Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:37:44.981738Z [\u001b[32minfo ] Evaluation complete test_loss=0.5641336441040039\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:37:44.986213Z [\u001b[32minfo ] Start Predict dataset=9888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:39:15.946557Z [\u001b[32minfo ] Starting training dataset=5212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:40:33.074149Z [\u001b[32minfo ] Training complete train_loss=0.04473729059100151\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:40:33.075860Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:40:41.380948Z [\u001b[32minfo ] Evaluation complete test_loss=0.5987882614135742\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:40:41.384711Z [\u001b[32minfo ] Start Predict dataset=9788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:42:07.956223Z [\u001b[32minfo ] Starting training dataset=5312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:43:26.188865Z [\u001b[32minfo ] Training complete train_loss=0.04242468625307083\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:43:26.190898Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:43:34.776365Z [\u001b[32minfo ] Evaluation complete test_loss=0.573499858379364\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:43:34.780809Z [\u001b[32minfo ] Start Predict dataset=9688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:45:02.444034Z [\u001b[32minfo ] Starting training dataset=5412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:46:22.986454Z [\u001b[32minfo ] Training complete train_loss=0.05522795766592026\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:46:22.988322Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:46:31.871482Z [\u001b[32minfo ] Evaluation complete test_loss=0.5418797731399536\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:46:31.876347Z [\u001b[32minfo ] Start Predict dataset=9588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:47:57.958922Z [\u001b[32minfo ] Starting training dataset=5512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:49:19.288858Z [\u001b[32minfo ] Training complete train_loss=0.05585930123925209\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:49:19.290700Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:49:27.887083Z [\u001b[32minfo ] Evaluation complete test_loss=0.5781568884849548\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:49:27.891558Z [\u001b[32minfo ] Start Predict dataset=9488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:50:51.759392Z [\u001b[32minfo ] Starting training dataset=5612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:52:10.167108Z [\u001b[32minfo ] Training complete train_loss=0.08212323486804962\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:52:10.168639Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:52:18.489575Z [\u001b[32minfo ] Evaluation complete test_loss=0.5190374255180359\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:52:18.568302Z [\u001b[32minfo ] Start Predict dataset=9388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:53:42.390100Z [\u001b[32minfo ] Starting training dataset=5712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:55:06.677886Z [\u001b[32minfo ] Training complete train_loss=0.08772403001785278\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:55:06.679728Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:55:15.179280Z [\u001b[32minfo ] Evaluation complete test_loss=0.4441128075122833\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:55:15.183333Z [\u001b[32minfo ] Start Predict dataset=9288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:56:40.256626Z [\u001b[32minfo ] Starting training dataset=5812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T17:58:00.871617Z [\u001b[32minfo ] Training complete train_loss=0.05669952929019928\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T17:58:00.873122Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T17:58:09.284316Z [\u001b[32minfo ] Evaluation complete test_loss=0.5060445666313171\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T17:58:09.288729Z [\u001b[32minfo ] Start Predict dataset=9188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T17:59:28.919979Z [\u001b[32minfo ] Starting training dataset=5912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:00:51.078877Z [\u001b[32minfo ] Training complete train_loss=0.07387827336788177\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:00:51.080834Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:00:59.669384Z [\u001b[32minfo ] Evaluation complete test_loss=0.5086865425109863\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:00:59.674395Z [\u001b[32minfo ] Start Predict dataset=9088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:02:19.394067Z [\u001b[32minfo ] Starting training dataset=6012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:03:42.782050Z [\u001b[32minfo ] Training complete train_loss=0.08140388131141663\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:03:42.783808Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:03:51.468837Z [\u001b[32minfo ] Evaluation complete test_loss=0.432904988527298\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:03:51.473174Z [\u001b[32minfo ] Start Predict dataset=8988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:05:10.688109Z [\u001b[32minfo ] Starting training dataset=6112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:06:36.281515Z [\u001b[32minfo ] Training complete train_loss=0.04091908782720566\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:06:36.283262Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:06:44.686541Z [\u001b[32minfo ] Evaluation complete test_loss=0.5736671686172485\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:06:44.690767Z [\u001b[32minfo ] Start Predict dataset=8888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:08:05.198629Z [\u001b[32minfo ] Starting training dataset=6212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:09:30.569787Z [\u001b[32minfo ] Training complete train_loss=0.06559653580188751\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:09:30.571482Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:09:38.668353Z [\u001b[32minfo ] Evaluation complete test_loss=0.5123801827430725\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:09:38.672007Z [\u001b[32minfo ] Start Predict dataset=8788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:10:56.504718Z [\u001b[32minfo ] Starting training dataset=6312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:12:24.391821Z [\u001b[32minfo ] Training complete train_loss=0.07125910371541977\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:12:24.393844Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:12:32.869289Z [\u001b[32minfo ] Evaluation complete test_loss=0.4685639441013336\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:12:32.873654Z [\u001b[32minfo ] Start Predict dataset=8688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:13:52.161712Z [\u001b[32minfo ] Starting training dataset=6412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:15:19.480747Z [\u001b[32minfo ] Training complete train_loss=0.08134432137012482\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:15:19.482431Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:15:27.767816Z [\u001b[32minfo ] Evaluation complete test_loss=0.4484506845474243\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:15:27.771722Z [\u001b[32minfo ] Start Predict dataset=8588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:16:45.635661Z [\u001b[32minfo ] Starting training dataset=6512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:18:13.391296Z [\u001b[32minfo ] Training complete train_loss=0.07426474988460541\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:18:13.392929Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:18:22.003691Z [\u001b[32minfo ] Evaluation complete test_loss=0.5092469453811646\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:18:22.007843Z [\u001b[32minfo ] Start Predict dataset=8488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:19:37.445279Z [\u001b[32minfo ] Starting training dataset=6612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:21:05.781014Z [\u001b[32minfo ] Training complete train_loss=0.05355135723948479\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:21:05.782790Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:21:14.180182Z [\u001b[32minfo ] Evaluation complete test_loss=0.504236102104187\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:21:14.184058Z [\u001b[32minfo ] Start Predict dataset=8388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:22:27.520906Z [\u001b[32minfo ] Starting training dataset=6712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:24:01.288497Z [\u001b[32minfo ] Training complete train_loss=0.06972303986549377\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:24:01.290234Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:24:09.581470Z [\u001b[32minfo ] Evaluation complete test_loss=0.505696177482605\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:24:09.585144Z [\u001b[32minfo ] Start Predict dataset=8288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:25:23.082250Z [\u001b[32minfo ] Starting training dataset=6812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:26:51.691389Z [\u001b[32minfo ] Training complete train_loss=0.08558610081672668\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:26:51.693177Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:27:00.171906Z [\u001b[32minfo ] Evaluation complete test_loss=0.4490826725959778\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:27:00.175724Z [\u001b[32minfo ] Start Predict dataset=8188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:28:12.777510Z [\u001b[32minfo ] Starting training dataset=6912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:29:44.587334Z [\u001b[32minfo ] Training complete train_loss=0.027112364768981934\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:29:44.589016Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:29:52.967716Z [\u001b[32minfo ] Evaluation complete test_loss=0.6299618482589722\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:29:52.971769Z [\u001b[32minfo ] Start Predict dataset=8088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:31:03.915833Z [\u001b[32minfo ] Starting training dataset=7012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:32:36.978128Z [\u001b[32minfo ] Training complete train_loss=0.05840374156832695\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:32:36.979875Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:32:45.379434Z [\u001b[32minfo ] Evaluation complete test_loss=0.49078628420829773\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:32:45.384069Z [\u001b[32minfo ] Start Predict dataset=7988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:33:57.892025Z [\u001b[32minfo ] Starting training dataset=7112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:35:31.374373Z [\u001b[32minfo ] Training complete train_loss=0.07398192584514618\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:35:31.375994Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:35:39.775096Z [\u001b[32minfo ] Evaluation complete test_loss=0.3967265784740448\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:35:39.779247Z [\u001b[32minfo ] Start Predict dataset=7888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:36:49.517415Z [\u001b[32minfo ] Starting training dataset=7212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:38:24.174877Z [\u001b[32minfo ] Training complete train_loss=0.04612201824784279\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:38:24.177022Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:38:32.379501Z [\u001b[32minfo ] Evaluation complete test_loss=0.49809715151786804\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:38:32.382998Z [\u001b[32minfo ] Start Predict dataset=7788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:39:41.917608Z [\u001b[32minfo ] Starting training dataset=7312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:41:17.174369Z [\u001b[32minfo ] Training complete train_loss=0.06543444097042084\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:41:17.176491Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:41:25.471666Z [\u001b[32minfo ] Evaluation complete test_loss=0.5225977897644043\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:41:25.475692Z [\u001b[32minfo ] Start Predict dataset=7688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:42:33.522329Z [\u001b[32minfo ] Starting training dataset=7412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:44:09.293539Z [\u001b[32minfo ] Training complete train_loss=0.060686737298965454\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:44:09.366949Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:44:17.887026Z [\u001b[32minfo ] Evaluation complete test_loss=0.5125385522842407\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:44:17.890832Z [\u001b[32minfo ] Start Predict dataset=7588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:45:25.400968Z [\u001b[32minfo ] Starting training dataset=7512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:47:02.366684Z [\u001b[32minfo ] Training complete train_loss=0.07019717246294022\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:47:02.368580Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:47:10.878960Z [\u001b[32minfo ] Evaluation complete test_loss=0.470739483833313\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:47:10.882850Z [\u001b[32minfo ] Start Predict dataset=7488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:48:16.235930Z [\u001b[32minfo ] Starting training dataset=7612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:49:53.493500Z [\u001b[32minfo ] Training complete train_loss=0.04707063362002373\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:49:53.495700Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:50:01.788730Z [\u001b[32minfo ] Evaluation complete test_loss=0.5984643697738647\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:50:01.866775Z [\u001b[32minfo ] Start Predict dataset=7388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:51:07.901960Z [\u001b[32minfo ] Starting training dataset=7712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:52:46.877393Z [\u001b[32minfo ] Training complete train_loss=0.06192445009946823\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:52:46.879872Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:52:55.176958Z [\u001b[32minfo ] Evaluation complete test_loss=0.5132700204849243\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:52:55.181403Z [\u001b[32minfo ] Start Predict dataset=7288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:54:00.257566Z [\u001b[32minfo ] Starting training dataset=7812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:55:43.588375Z [\u001b[32minfo ] Training complete train_loss=0.03979147970676422\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:55:43.590052Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:55:51.875854Z [\u001b[32minfo ] Evaluation complete test_loss=0.5150150656700134\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:55:51.880159Z [\u001b[32minfo ] Start Predict dataset=7188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:56:56.536715Z [\u001b[32minfo ] Starting training dataset=7912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T18:58:41.879868Z [\u001b[32minfo ] Training complete train_loss=0.06223804131150246\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T18:58:41.881963Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T18:58:50.468474Z [\u001b[32minfo ] Evaluation complete test_loss=0.5643148422241211\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T18:58:50.472745Z [\u001b[32minfo ] Start Predict dataset=7088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T18:59:54.295632Z [\u001b[32minfo ] Starting training dataset=8012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:01:35.476312Z [\u001b[32minfo ] Training complete train_loss=0.06737291067838669\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:01:35.478476Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:01:43.780950Z [\u001b[32minfo ] Evaluation complete test_loss=0.48747894167900085\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:01:43.785449Z [\u001b[32minfo ] Start Predict dataset=6988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:02:46.131753Z [\u001b[32minfo ] Starting training dataset=8112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:04:29.880063Z [\u001b[32minfo ] Training complete train_loss=0.05925571545958519\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:04:29.881840Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:04:38.267654Z [\u001b[32minfo ] Evaluation complete test_loss=0.564688503742218\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:04:38.271766Z [\u001b[32minfo ] Start Predict dataset=6888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:05:39.800913Z [\u001b[32minfo ] Starting training dataset=8212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:07:23.486186Z [\u001b[32minfo ] Training complete train_loss=0.040423326194286346\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:07:23.487988Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:07:31.880357Z [\u001b[32minfo ] Evaluation complete test_loss=0.5314301252365112\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:07:31.884041Z [\u001b[32minfo ] Start Predict dataset=6788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:08:32.589051Z [\u001b[32minfo ] Starting training dataset=8312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:10:17.365494Z [\u001b[32minfo ] Training complete train_loss=0.06140904501080513\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:10:17.367863Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:10:25.972279Z [\u001b[32minfo ] Evaluation complete test_loss=0.49688518047332764\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:10:25.976035Z [\u001b[32minfo ] Start Predict dataset=6688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:11:25.835585Z [\u001b[32minfo ] Starting training dataset=8412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:13:10.774985Z [\u001b[32minfo ] Training complete train_loss=0.050612445920705795\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:13:10.776734Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:13:19.167663Z [\u001b[32minfo ] Evaluation complete test_loss=0.5152626037597656\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:13:19.171962Z [\u001b[32minfo ] Start Predict dataset=6588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:14:18.243027Z [\u001b[32minfo ] Starting training dataset=8512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:16:04.791515Z [\u001b[32minfo ] Training complete train_loss=0.033627718687057495\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:16:04.793285Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:16:13.288919Z [\u001b[32minfo ] Evaluation complete test_loss=0.4974973201751709\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:16:13.292998Z [\u001b[32minfo ] Start Predict dataset=6488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:17:11.209628Z [\u001b[32minfo ] Starting training dataset=8612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:18:58.093512Z [\u001b[32minfo ] Training complete train_loss=0.04596696048974991\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:18:58.095249Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:19:06.484257Z [\u001b[32minfo ] Evaluation complete test_loss=0.6129364967346191\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:19:06.488960Z [\u001b[32minfo ] Start Predict dataset=6388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:20:04.209439Z [\u001b[32minfo ] Starting training dataset=8712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:21:55.388171Z [\u001b[32minfo ] Training complete train_loss=0.049339789897203445\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:21:55.390263Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:22:04.078244Z [\u001b[32minfo ] Evaluation complete test_loss=0.5226636528968811\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:22:04.082582Z [\u001b[32minfo ] Start Predict dataset=6288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:23:01.440119Z [\u001b[32minfo ] Starting training dataset=8812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:24:53.994417Z [\u001b[32minfo ] Training complete train_loss=0.061960719525814056\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:24:53.996345Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:25:02.372585Z [\u001b[32minfo ] Evaluation complete test_loss=0.5233380198478699\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:25:02.377398Z [\u001b[32minfo ] Start Predict dataset=6188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:25:58.792316Z [\u001b[32minfo ] Starting training dataset=8912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:27:51.591211Z [\u001b[32minfo ] Training complete train_loss=0.04009644314646721\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:27:51.593404Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:28:00.069251Z [\u001b[32minfo ] Evaluation complete test_loss=0.5073189735412598\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:28:00.073255Z [\u001b[32minfo ] Start Predict dataset=6088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:28:55.614687Z [\u001b[32minfo ] Starting training dataset=9012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:30:48.691304Z [\u001b[32minfo ] Training complete train_loss=0.02636721171438694\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:30:48.693153Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:30:57.194178Z [\u001b[32minfo ] Evaluation complete test_loss=0.4932842552661896\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:30:57.198696Z [\u001b[32minfo ] Start Predict dataset=5988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:31:52.014806Z [\u001b[32minfo ] Starting training dataset=9112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:33:50.286149Z [\u001b[32minfo ] Training complete train_loss=0.038404107093811035\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:33:50.288005Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:33:58.867647Z [\u001b[32minfo ] Evaluation complete test_loss=0.4707241952419281\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:33:58.871875Z [\u001b[32minfo ] Start Predict dataset=5888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:34:52.597191Z [\u001b[32minfo ] Starting training dataset=9212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:36:52.979111Z [\u001b[32minfo ] Training complete train_loss=0.03221401944756508\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:36:52.981332Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:37:01.382679Z [\u001b[32minfo ] Evaluation complete test_loss=0.6017580628395081\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:37:01.386797Z [\u001b[32minfo ] Start Predict dataset=5788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:37:57.489992Z [\u001b[32minfo ] Starting training dataset=9312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:39:55.967846Z [\u001b[32minfo ] Training complete train_loss=0.04899067059159279\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:39:55.969679Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:40:04.272984Z [\u001b[32minfo ] Evaluation complete test_loss=0.4522852897644043\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:40:04.278449Z [\u001b[32minfo ] Start Predict dataset=5688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:40:57.436623Z [\u001b[32minfo ] Starting training dataset=9412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:42:55.093364Z [\u001b[32minfo ] Training complete train_loss=0.0554390586912632\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:42:55.095567Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:43:03.678599Z [\u001b[32minfo ] Evaluation complete test_loss=0.5251901149749756\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:43:03.682851Z [\u001b[32minfo ] Start Predict dataset=5588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:43:54.653131Z [\u001b[32minfo ] Starting training dataset=9512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:45:51.790112Z [\u001b[32minfo ] Training complete train_loss=0.04198170453310013\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:45:51.791730Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:46:00.284703Z [\u001b[32minfo ] Evaluation complete test_loss=0.4602336585521698\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:46:00.289764Z [\u001b[32minfo ] Start Predict dataset=5488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:46:50.358892Z [\u001b[32minfo ] Starting training dataset=9612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:48:46.692007Z [\u001b[32minfo ] Training complete train_loss=0.044020820409059525\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:48:46.693602Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:48:55.168600Z [\u001b[32minfo ] Evaluation complete test_loss=0.508590579032898\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:48:55.172559Z [\u001b[32minfo ] Start Predict dataset=5388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:49:43.776890Z [\u001b[32minfo ] Starting training dataset=9712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:51:42.176822Z [\u001b[32minfo ] Training complete train_loss=0.044634196907281876\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:51:42.179048Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:51:50.672765Z [\u001b[32minfo ] Evaluation complete test_loss=0.45551300048828125\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:51:50.676552Z [\u001b[32minfo ] Start Predict dataset=5288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:52:38.484667Z [\u001b[32minfo ] Starting training dataset=9812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:54:36.177208Z [\u001b[32minfo ] Training complete train_loss=0.04296591877937317\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:54:36.178931Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:54:44.583521Z [\u001b[32minfo ] Evaluation complete test_loss=0.47437164187431335\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:54:44.587728Z [\u001b[32minfo ] Start Predict dataset=5188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:55:31.707641Z [\u001b[32minfo ] Starting training dataset=9912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T19:57:30.279767Z [\u001b[32minfo ] Training complete train_loss=0.03641924262046814\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T19:57:30.281499Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T19:57:38.466727Z [\u001b[32minfo ] Evaluation complete test_loss=0.5371329188346863\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T19:57:38.470612Z [\u001b[32minfo ] Start Predict dataset=5088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T19:58:24.271875Z [\u001b[32minfo ] Starting training dataset=10012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:00:25.895237Z [\u001b[32minfo ] Training complete train_loss=0.03702413663268089\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:00:25.897511Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:00:34.571676Z [\u001b[32minfo ] Evaluation complete test_loss=0.5401384234428406\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:00:34.576397Z [\u001b[32minfo ] Start Predict dataset=4988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:01:22.740200Z [\u001b[32minfo ] Starting training dataset=10112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:03:27.695031Z [\u001b[32minfo ] Training complete train_loss=0.05812685936689377\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:03:27.698130Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:03:36.176022Z [\u001b[32minfo ] Evaluation complete test_loss=0.4845340847969055\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:03:36.179729Z [\u001b[32minfo ] Start Predict dataset=4888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:04:22.320485Z [\u001b[32minfo ] Starting training dataset=10212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:06:27.779220Z [\u001b[32minfo ] Training complete train_loss=0.0416891947388649\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:06:27.781323Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:06:36.067846Z [\u001b[32minfo ] Evaluation complete test_loss=0.46326562762260437\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:06:36.072402Z [\u001b[32minfo ] Start Predict dataset=4788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:07:22.045382Z [\u001b[32minfo ] Starting training dataset=10312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:09:30.292345Z [\u001b[32minfo ] Training complete train_loss=0.041010648012161255\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:09:30.294497Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:09:38.871867Z [\u001b[32minfo ] Evaluation complete test_loss=0.49109208583831787\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:09:38.876577Z [\u001b[32minfo ] Start Predict dataset=4688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:10:24.154535Z [\u001b[32minfo ] Starting training dataset=10412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:12:33.415576Z [\u001b[32minfo ] Training complete train_loss=0.028930526226758957\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:12:33.417436Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:12:42.284002Z [\u001b[32minfo ] Evaluation complete test_loss=0.5107240676879883\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:12:42.288185Z [\u001b[32minfo ] Start Predict dataset=4588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:13:26.235085Z [\u001b[32minfo ] Starting training dataset=10512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:15:35.385270Z [\u001b[32minfo ] Training complete train_loss=0.031576935201883316\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:15:35.387155Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:15:43.779605Z [\u001b[32minfo ] Evaluation complete test_loss=0.6074793338775635\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:15:43.783983Z [\u001b[32minfo ] Start Predict dataset=4488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:16:27.250669Z [\u001b[32minfo ] Starting training dataset=10612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:18:38.077150Z [\u001b[32minfo ] Training complete train_loss=0.025566451251506805\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:18:38.079132Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:18:46.374243Z [\u001b[32minfo ] Evaluation complete test_loss=0.6829275488853455\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:18:46.378420Z [\u001b[32minfo ] Start Predict dataset=4388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:19:29.033297Z [\u001b[32minfo ] Starting training dataset=10712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:21:40.186438Z [\u001b[32minfo ] Training complete train_loss=0.03265538439154625\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:21:40.188513Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:21:48.585218Z [\u001b[32minfo ] Evaluation complete test_loss=0.48324117064476013\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:21:48.589070Z [\u001b[32minfo ] Start Predict dataset=4288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:22:29.946839Z [\u001b[32minfo ] Starting training dataset=10812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:24:41.668771Z [\u001b[32minfo ] Training complete train_loss=0.04195354878902435\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:24:41.670907Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:24:49.971122Z [\u001b[32minfo ] Evaluation complete test_loss=0.5226069688796997\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:24:49.975409Z [\u001b[32minfo ] Start Predict dataset=4188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:25:30.746312Z [\u001b[32minfo ] Starting training dataset=10912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:27:43.090670Z [\u001b[32minfo ] Training complete train_loss=0.03698594123125076\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:27:43.092609Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:27:51.571422Z [\u001b[32minfo ] Evaluation complete test_loss=0.5863118767738342\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:27:51.576126Z [\u001b[32minfo ] Start Predict dataset=4088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:28:31.534919Z [\u001b[32minfo ] Starting training dataset=11012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:30:44.186803Z [\u001b[32minfo ] Training complete train_loss=0.04117933288216591\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:30:44.188896Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:30:52.585989Z [\u001b[32minfo ] Evaluation complete test_loss=0.4863777160644531\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:30:52.589908Z [\u001b[32minfo ] Start Predict dataset=3988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:31:31.716610Z [\u001b[32minfo ] Starting training dataset=11112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:33:46.383277Z [\u001b[32minfo ] Training complete train_loss=0.04627871885895729\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:33:46.385149Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:33:54.779086Z [\u001b[32minfo ] Evaluation complete test_loss=0.470214307308197\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:33:54.783126Z [\u001b[32minfo ] Start Predict dataset=3888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:34:33.190703Z [\u001b[32minfo ] Starting training dataset=11212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:36:48.681686Z [\u001b[32minfo ] Training complete train_loss=0.04613247141242027\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:36:48.683987Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:36:57.278346Z [\u001b[32minfo ] Evaluation complete test_loss=0.4558008015155792\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:36:57.282786Z [\u001b[32minfo ] Start Predict dataset=3788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:37:34.291065Z [\u001b[32minfo ] Starting training dataset=11312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:39:51.289119Z [\u001b[32minfo ] Training complete train_loss=0.02740650437772274\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:39:51.291195Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:39:59.882257Z [\u001b[32minfo ] Evaluation complete test_loss=0.5408679246902466\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:39:59.887662Z [\u001b[32minfo ] Start Predict dataset=3688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:40:35.740146Z [\u001b[32minfo ] Starting training dataset=11412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:42:52.090300Z [\u001b[32minfo ] Training complete train_loss=0.0475490465760231\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:42:52.092186Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:43:00.380313Z [\u001b[32minfo ] Evaluation complete test_loss=0.48505839705467224\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:43:00.384850Z [\u001b[32minfo ] Start Predict dataset=3588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:43:35.409152Z [\u001b[32minfo ] Starting training dataset=11512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:45:53.678636Z [\u001b[32minfo ] Training complete train_loss=0.029358908534049988\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:45:53.680683Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:46:01.986445Z [\u001b[32minfo ] Evaluation complete test_loss=0.5404146313667297\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:46:01.990899Z [\u001b[32minfo ] Start Predict dataset=3488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:46:36.192044Z [\u001b[32minfo ] Starting training dataset=11612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:48:54.491902Z [\u001b[32minfo ] Training complete train_loss=0.03432118520140648\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:48:54.494055Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:49:03.187720Z [\u001b[32minfo ] Evaluation complete test_loss=0.5160902142524719\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:49:03.192422Z [\u001b[32minfo ] Start Predict dataset=3388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:49:36.627034Z [\u001b[32minfo ] Starting training dataset=11712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:51:53.684736Z [\u001b[32minfo ] Training complete train_loss=0.025415150448679924\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:51:53.687300Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:52:02.180711Z [\u001b[32minfo ] Evaluation complete test_loss=0.4870961308479309\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:52:02.185758Z [\u001b[32minfo ] Start Predict dataset=3288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:52:34.039245Z [\u001b[32minfo ] Starting training dataset=11812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:54:52.384529Z [\u001b[32minfo ] Training complete train_loss=0.04198145121335983\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:54:52.386512Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:55:00.588475Z [\u001b[32minfo ] Evaluation complete test_loss=0.4633568823337555\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:55:00.593256Z [\u001b[32minfo ] Start Predict dataset=3188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:55:32.526412Z [\u001b[32minfo ] Starting training dataset=11912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T20:57:52.280876Z [\u001b[32minfo ] Training complete train_loss=0.027177348732948303\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T20:57:52.282884Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T20:58:00.669065Z [\u001b[32minfo ] Evaluation complete test_loss=0.5218877792358398\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T20:58:00.673801Z [\u001b[32minfo ] Start Predict dataset=3088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T20:58:31.200998Z [\u001b[32minfo ] Starting training dataset=12012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:00:51.678552Z [\u001b[32minfo ] Training complete train_loss=0.03196889907121658\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:00:51.680520Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:00:59.969555Z [\u001b[32minfo ] Evaluation complete test_loss=0.4975127875804901\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:00:59.974549Z [\u001b[32minfo ] Start Predict dataset=2988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:01:29.609903Z [\u001b[32minfo ] Starting training dataset=12112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:03:52.395673Z [\u001b[32minfo ] Training complete train_loss=0.029373254626989365\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:03:52.397885Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:04:00.787526Z [\u001b[32minfo ] Evaluation complete test_loss=0.48535236716270447\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:04:00.791903Z [\u001b[32minfo ] Start Predict dataset=2888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:04:29.892682Z [\u001b[32minfo ] Starting training dataset=12212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:06:53.894096Z [\u001b[32minfo ] Training complete train_loss=0.029735658317804337\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:06:53.895831Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:07:02.080102Z [\u001b[32minfo ] Evaluation complete test_loss=0.5647792220115662\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:07:02.084360Z [\u001b[32minfo ] Start Predict dataset=2788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:07:30.166622Z [\u001b[32minfo ] Starting training dataset=12312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:09:55.179011Z [\u001b[32minfo ] Training complete train_loss=0.0220402330160141\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:09:55.180688Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:10:03.477053Z [\u001b[32minfo ] Evaluation complete test_loss=0.5359561443328857\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:10:03.481581Z [\u001b[32minfo ] Start Predict dataset=2688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:10:30.528810Z [\u001b[32minfo ] Starting training dataset=12412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:12:54.091861Z [\u001b[32minfo ] Training complete train_loss=0.03964528813958168\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:12:54.094102Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:13:02.369356Z [\u001b[32minfo ] Evaluation complete test_loss=0.48836764693260193\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:13:02.374057Z [\u001b[32minfo ] Start Predict dataset=2588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:13:27.496414Z [\u001b[32minfo ] Starting training dataset=12512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:15:48.582187Z [\u001b[32minfo ] Training complete train_loss=0.04704689979553223\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:15:48.584036Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:15:56.668202Z [\u001b[32minfo ] Evaluation complete test_loss=0.44271594285964966\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:15:56.672079Z [\u001b[32minfo ] Start Predict dataset=2488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:16:21.327724Z [\u001b[32minfo ] Starting training dataset=12612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:18:50.083526Z [\u001b[32minfo ] Training complete train_loss=0.0628628060221672\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:18:50.085875Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:18:58.770438Z [\u001b[32minfo ] Evaluation complete test_loss=0.503575325012207\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:18:58.775111Z [\u001b[32minfo ] Start Predict dataset=2388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:19:22.821996Z [\u001b[32minfo ] Starting training dataset=12712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:21:51.483589Z [\u001b[32minfo ] Training complete train_loss=0.02907649055123329\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:21:51.485452Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:21:59.875917Z [\u001b[32minfo ] Evaluation complete test_loss=0.5075518488883972\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:21:59.879896Z [\u001b[32minfo ] Start Predict dataset=2288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:22:23.219614Z [\u001b[32minfo ] Starting training dataset=12812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:24:51.385507Z [\u001b[32minfo ] Training complete train_loss=0.030792735517024994\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:24:51.387626Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:24:59.785684Z [\u001b[32minfo ] Evaluation complete test_loss=0.44357359409332275\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:24:59.790069Z [\u001b[32minfo ] Start Predict dataset=2188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:25:22.213913Z [\u001b[32minfo ] Starting training dataset=12912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:27:50.994998Z [\u001b[32minfo ] Training complete train_loss=0.03515024855732918\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:27:50.996822Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:27:59.080001Z [\u001b[32minfo ] Evaluation complete test_loss=0.45534369349479675\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:27:59.084982Z [\u001b[32minfo ] Start Predict dataset=2088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:28:19.309633Z [\u001b[32minfo ] Starting training dataset=13012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:30:47.379066Z [\u001b[32minfo ] Training complete train_loss=0.028724441304802895\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:30:47.381229Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:30:55.668478Z [\u001b[32minfo ] Evaluation complete test_loss=0.5947825908660889\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:30:55.672879Z [\u001b[32minfo ] Start Predict dataset=1988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:31:16.513428Z [\u001b[32minfo ] Starting training dataset=13112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:33:46.087868Z [\u001b[32minfo ] Training complete train_loss=0.03632659092545509\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:33:46.090190Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:33:54.470903Z [\u001b[32minfo ] Evaluation complete test_loss=0.3916391432285309\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:33:54.475662Z [\u001b[32minfo ] Start Predict dataset=1888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:34:13.817268Z [\u001b[32minfo ] Starting training dataset=13212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:36:44.087828Z [\u001b[32minfo ] Training complete train_loss=0.0293908528983593\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:36:44.089967Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:36:52.285403Z [\u001b[32minfo ] Evaluation complete test_loss=0.553371787071228\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:36:52.289324Z [\u001b[32minfo ] Start Predict dataset=1788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:37:10.997355Z [\u001b[32minfo ] Starting training dataset=13312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:39:42.870680Z [\u001b[32minfo ] Training complete train_loss=0.024943392723798752\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:39:42.872809Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:39:51.376196Z [\u001b[32minfo ] Evaluation complete test_loss=0.5904661417007446\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:39:51.380310Z [\u001b[32minfo ] Start Predict dataset=1688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:40:09.211327Z [\u001b[32minfo ] Starting training dataset=13412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:42:40.875515Z [\u001b[32minfo ] Training complete train_loss=0.03411614149808884\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:42:40.877663Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:42:49.181025Z [\u001b[32minfo ] Evaluation complete test_loss=0.5226052403450012\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:42:49.185080Z [\u001b[32minfo ] Start Predict dataset=1588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:43:06.385285Z [\u001b[32minfo ] Starting training dataset=13512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:45:39.896472Z [\u001b[32minfo ] Training complete train_loss=0.02790781483054161\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:45:39.898469Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:45:48.768464Z [\u001b[32minfo ] Evaluation complete test_loss=0.48294728994369507\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:45:48.773007Z [\u001b[32minfo ] Start Predict dataset=1488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:46:05.101706Z [\u001b[32minfo ] Starting training dataset=13612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:48:38.587002Z [\u001b[32minfo ] Training complete train_loss=0.021476466208696365\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:48:38.589038Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:48:46.967470Z [\u001b[32minfo ] Evaluation complete test_loss=0.6516388654708862\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:48:46.971723Z [\u001b[32minfo ] Start Predict dataset=1388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:49:02.400625Z [\u001b[32minfo ] Starting training dataset=13712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:51:38.993724Z [\u001b[32minfo ] Training complete train_loss=0.02276446670293808\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:51:38.996209Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:51:47.371012Z [\u001b[32minfo ] Evaluation complete test_loss=0.5613048076629639\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:51:47.375167Z [\u001b[32minfo ] Start Predict dataset=1288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:52:01.682782Z [\u001b[32minfo ] Starting training dataset=13812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:54:41.984078Z [\u001b[32minfo ] Training complete train_loss=0.028370166197419167\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:54:41.985861Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:54:50.184247Z [\u001b[32minfo ] Evaluation complete test_loss=0.4826107919216156\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:54:50.188377Z [\u001b[32minfo ] Start Predict dataset=1188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:55:03.498739Z [\u001b[32minfo ] Starting training dataset=13912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T21:57:39.587512Z [\u001b[32minfo ] Training complete train_loss=0.022563118487596512\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T21:57:39.589138Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T21:57:48.077240Z [\u001b[32minfo ] Evaluation complete test_loss=0.5660622715950012\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T21:57:48.080857Z [\u001b[32minfo ] Start Predict dataset=1088\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T21:58:00.385493Z [\u001b[32minfo ] Starting training dataset=14012 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:00:36.183661Z [\u001b[32minfo ] Training complete train_loss=0.019728390499949455\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:00:36.185580Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:00:44.479725Z [\u001b[32minfo ] Evaluation complete test_loss=0.5509332418441772\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:00:44.484148Z [\u001b[32minfo ] Start Predict dataset=988\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:00:56.003275Z [\u001b[32minfo ] Starting training dataset=14112 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:03:32.995503Z [\u001b[32minfo ] Training complete train_loss=0.03207049146294594\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:03:32.997422Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:03:41.378439Z [\u001b[32minfo ] Evaluation complete test_loss=0.5133938193321228\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:03:41.382642Z [\u001b[32minfo ] Start Predict dataset=888\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:03:52.104510Z [\u001b[32minfo ] Starting training dataset=14212 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:06:29.792679Z [\u001b[32minfo ] Training complete train_loss=0.037114620208740234\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:06:29.794621Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:06:38.077351Z [\u001b[32minfo ] Evaluation complete test_loss=0.5605881214141846\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:06:38.082232Z [\u001b[32minfo ] Start Predict dataset=788\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:06:48.003399Z [\u001b[32minfo ] Starting training dataset=14312 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:09:27.080457Z [\u001b[32minfo ] Training complete train_loss=0.02423758991062641\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:09:27.082417Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:09:35.575955Z [\u001b[32minfo ] Evaluation complete test_loss=0.4343477487564087\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:09:35.580152Z [\u001b[32minfo ] Start Predict dataset=688\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:09:44.492943Z [\u001b[32minfo ] Starting training dataset=14412 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:12:23.871556Z [\u001b[32minfo ] Training complete train_loss=0.025025462731719017\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:12:23.873606Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:12:32.375955Z [\u001b[32minfo ] Evaluation complete test_loss=0.49787238240242004\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:12:32.380195Z [\u001b[32minfo ] Start Predict dataset=588\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:12:40.275824Z [\u001b[32minfo ] Starting training dataset=14512 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:15:20.281568Z [\u001b[32minfo ] Training complete train_loss=0.025414831936359406\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:15:20.283811Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:15:28.687757Z [\u001b[32minfo ] Evaluation complete test_loss=0.5783530473709106\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:15:28.691740Z [\u001b[32minfo ] Start Predict dataset=488\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:15:35.699876Z [\u001b[32minfo ] Starting training dataset=14612 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:18:15.677656Z [\u001b[32minfo ] Training complete train_loss=0.03256714344024658\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:18:15.679679Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:18:23.788199Z [\u001b[32minfo ] Evaluation complete test_loss=0.4111860692501068\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:18:23.792636Z [\u001b[32minfo ] Start Predict dataset=388\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:18:29.996618Z [\u001b[32minfo ] Starting training dataset=14712 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:21:10.987781Z [\u001b[32minfo ] Training complete train_loss=0.031116826459765434\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:21:10.989619Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:21:19.588062Z [\u001b[32minfo ] Evaluation complete test_loss=0.6319525837898254\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:21:19.592013Z [\u001b[32minfo ] Start Predict dataset=288\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:21:24.766597Z [\u001b[32minfo ] Starting training dataset=14812 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:24:06.881760Z [\u001b[32minfo ] Training complete train_loss=0.029832642525434494\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:24:06.884590Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:24:15.278154Z [\u001b[32minfo ] Evaluation complete test_loss=0.47507792711257935\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:24:15.282500Z [\u001b[32minfo ] Start Predict dataset=188\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:24:19.298270Z [\u001b[32minfo ] Starting training dataset=14912 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:27:02.087124Z [\u001b[32minfo ] Training complete train_loss=0.028897961601614952\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:27:02.089190Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:27:10.469021Z [\u001b[32minfo ] Evaluation complete test_loss=0.4525805115699768\n",
- "[131537-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T22:27:10.473399Z [\u001b[32minfo ] Start Predict dataset=88\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T22:27:13.881440Z [\u001b[32minfo ] Starting training dataset=15000 epoch=10\n",
- "[131537-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T22:29:57.197467Z [\u001b[32minfo ] Training complete train_loss=0.03034781664609909\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T22:29:57.199424Z [\u001b[32minfo ] Starting evaluating dataset=3000\n",
- "[131537-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T22:30:05.574946Z [\u001b[32minfo ] Evaluation complete test_loss=0.45382431149482727\n"
- ]
+ "execution_count": null,
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n",
+ "is_executing": true
}
- ],
+ },
+ "outputs": [],
"source": [
"labelling_progress = active_set._labelled.copy().astype(np.uint16)\n",
"for epoch in tqdm(range(hyperparams.epoch)):\n",
@@ -1142,7 +417,11 @@
{
"cell_type": "code",
"execution_count": 7,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stdout",
@@ -1162,7 +441,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Visualization\n",
"\n",
@@ -1175,7 +458,11 @@
{
"cell_type": "code",
"execution_count": 8,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"# modify our model to get features\n",
@@ -1203,7 +490,11 @@
{
"cell_type": "code",
"execution_count": 9,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from MulticoreTSNE import MulticoreTSNE as TSNE\n",
@@ -1216,7 +507,11 @@
{
"cell_type": "code",
"execution_count": 10,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -1249,7 +544,11 @@
{
"cell_type": "code",
"execution_count": 11,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from baal.utils.plot_utils import make_animation_from_data\n",
@@ -273855,4 +273154,4 @@
},
"nbformat": 4,
"nbformat_minor": 4
-}
+}
\ No newline at end of file
diff --git a/notebooks/deep_ensemble.ipynb b/notebooks/deep_ensemble.ipynb
index 79550a46..4dac30b8 100644
--- a/notebooks/deep_ensemble.ipynb
+++ b/notebooks/deep_ensemble.ipynb
@@ -2,7 +2,11 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"# How to use Deep ensembles in BaaL\n",
"\n",
@@ -22,7 +26,11 @@
{
"cell_type": "code",
"execution_count": 1,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"import random\n",
@@ -59,7 +67,11 @@
{
"cell_type": "code",
"execution_count": 2,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"@dataclass\n",
@@ -106,7 +118,11 @@
{
"cell_type": "code",
"execution_count": 3,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stdout",
@@ -157,7 +173,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Presenting EnsembleModelWrapper\n",
"\n",
@@ -185,1079 +205,14 @@
},
{
"cell_type": "code",
- "execution_count": 4,
- "metadata": {},
- "outputs": [
- {
- "data": {
- "application/vnd.jupyter.widget-view+json": {
- "model_id": "080ccd9d4e1f4c7d8f56f08f5572fdab",
- "version_major": 2,
- "version_minor": 0
- },
- "text/plain": [
- " 0%| | 0/58 [00:00, ?it/s]"
- ]
- },
- "metadata": {},
- "output_type": "display_data"
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:03:23.225157Z [\u001B[32minfo ] Starting training dataset=512 epoch=10\n"
- ]
- },
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/opt/conda/lib/python3.9/site-packages/torch/utils/data/dataloader.py:478: UserWarning: This DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 1, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.\n",
- " warnings.warn(_create_warning_msg(\n",
- "/opt/conda/lib/python3.9/site-packages/torch/nn/functional.py:718: UserWarning: Named tensors and all their associated APIs are an experimental feature and subject to change. Please do not use them for anything important until they are released as stable. (Triggered internally at /pytorch/c10/core/TensorImpl.h:1156.)\n",
- " return torch.max_pool2d(input, kernel_size, stride, padding, dilation, ceil_mode)\n"
- ]
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:03:54.272799Z [\u001B[32minfo ] Training complete train_loss=0.8364141583442688\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:03:54.281157Z [\u001B[32minfo ] Starting training dataset=512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:04:27.085055Z [\u001B[32minfo ] Training complete train_loss=0.8260515332221985\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:04:27.092947Z [\u001B[32minfo ] Starting training dataset=512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:05:00.277956Z [\u001B[32minfo ] Training complete train_loss=0.8478720188140869\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:05:00.286100Z [\u001B[32minfo ] Starting training dataset=512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:05:33.876247Z [\u001B[32minfo ] Training complete train_loss=0.8530490398406982\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:05:33.890366Z [\u001B[32minfo ] Starting training dataset=512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:06:08.077229Z [\u001B[32minfo ] Training complete train_loss=0.8666098117828369\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T15:06:08.084093Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T15:06:22.489413Z [\u001B[32minfo ] Evaluation complete test_loss=1.082019567489624\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T15:06:22.498875Z [\u001B[32minfo ] Start Predict dataset=49488\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:23:19.746892Z [\u001B[32minfo ] Starting training dataset=612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:23:49.982128Z [\u001B[32minfo ] Training complete train_loss=0.8243563771247864\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:23:49.989835Z [\u001B[32minfo ] Starting training dataset=612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:24:21.172522Z [\u001B[32minfo ] Training complete train_loss=0.9930000305175781\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:24:21.180164Z [\u001B[32minfo ] Starting training dataset=612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:24:54.582503Z [\u001B[32minfo ] Training complete train_loss=0.9522870779037476\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:24:54.590289Z [\u001B[32minfo ] Starting training dataset=612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:25:27.677519Z [\u001B[32minfo ] Training complete train_loss=1.0974868535995483\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:25:27.684730Z [\u001B[32minfo ] Starting training dataset=612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:26:00.088513Z [\u001B[32minfo ] Training complete train_loss=1.154961109161377\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T15:26:00.169698Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T15:26:13.881201Z [\u001B[32minfo ] Evaluation complete test_loss=1.050412893295288\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T15:26:13.888829Z [\u001B[32minfo ] Start Predict dataset=49388\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:43:24.920358Z [\u001B[32minfo ] Starting training dataset=712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:43:57.485136Z [\u001B[32minfo ] Training complete train_loss=0.932747483253479\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:43:57.492869Z [\u001B[32minfo ] Starting training dataset=712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:44:30.077726Z [\u001B[32minfo ] Training complete train_loss=0.918374240398407\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:44:30.085204Z [\u001B[32minfo ] Starting training dataset=712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:45:02.867968Z [\u001B[32minfo ] Training complete train_loss=0.9606548547744751\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:45:02.875076Z [\u001B[32minfo ] Starting training dataset=712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:45:35.279739Z [\u001B[32minfo ] Training complete train_loss=1.016992211341858\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T15:45:35.287374Z [\u001B[32minfo ] Starting training dataset=712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T15:46:07.184076Z [\u001B[32minfo ] Training complete train_loss=0.9882396459579468\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T15:46:07.189735Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T15:46:20.788120Z [\u001B[32minfo ] Evaluation complete test_loss=1.0184006690979004\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T15:46:20.795889Z [\u001B[32minfo ] Start Predict dataset=49288\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:03:30.744977Z [\u001B[32minfo ] Starting training dataset=812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:04:04.781179Z [\u001B[32minfo ] Training complete train_loss=0.9256729483604431\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:04:04.788756Z [\u001B[32minfo ] Starting training dataset=812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:04:38.886644Z [\u001B[32minfo ] Training complete train_loss=0.8962534666061401\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:04:38.894324Z [\u001B[32minfo ] Starting training dataset=812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:05:12.584859Z [\u001B[32minfo ] Training complete train_loss=1.099798560142517\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:05:12.592090Z [\u001B[32minfo ] Starting training dataset=812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:05:46.968839Z [\u001B[32minfo ] Training complete train_loss=0.9000493288040161\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:05:46.976785Z [\u001B[32minfo ] Starting training dataset=812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:06:20.685078Z [\u001B[32minfo ] Training complete train_loss=0.9171968698501587\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T16:06:20.690647Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T16:06:34.579119Z [\u001B[32minfo ] Evaluation complete test_loss=0.9765419960021973\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T16:06:34.586519Z [\u001B[32minfo ] Start Predict dataset=49188\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:23:45.107789Z [\u001B[32minfo ] Starting training dataset=912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:24:20.079465Z [\u001B[32minfo ] Training complete train_loss=0.8302789926528931\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:24:20.086495Z [\u001B[32minfo ] Starting training dataset=912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:24:54.285867Z [\u001B[32minfo ] Training complete train_loss=0.9873713254928589\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:24:54.295280Z [\u001B[32minfo ] Starting training dataset=912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:25:28.584150Z [\u001B[32minfo ] Training complete train_loss=0.8690028786659241\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:25:28.591403Z [\u001B[32minfo ] Starting training dataset=912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:26:02.984630Z [\u001B[32minfo ] Training complete train_loss=0.9912691712379456\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:26:02.991666Z [\u001B[32minfo ] Starting training dataset=912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:26:37.672282Z [\u001B[32minfo ] Training complete train_loss=0.9958629012107849\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T16:26:37.677315Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T16:26:51.273046Z [\u001B[32minfo ] Evaluation complete test_loss=0.9095031023025513\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T16:26:51.281388Z [\u001B[32minfo ] Start Predict dataset=49088\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:43:41.295745Z [\u001B[32minfo ] Starting training dataset=1012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:44:16.976296Z [\u001B[32minfo ] Training complete train_loss=0.9254322052001953\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:44:16.983815Z [\u001B[32minfo ] Starting training dataset=1012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:44:52.478948Z [\u001B[32minfo ] Training complete train_loss=1.0186196565628052\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:44:52.485844Z [\u001B[32minfo ] Starting training dataset=1012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:45:27.668294Z [\u001B[32minfo ] Training complete train_loss=0.9414865970611572\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:45:27.675891Z [\u001B[32minfo ] Starting training dataset=1012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:46:03.285576Z [\u001B[32minfo ] Training complete train_loss=0.9331195950508118\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T16:46:03.292214Z [\u001B[32minfo ] Starting training dataset=1012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T16:46:39.089283Z [\u001B[32minfo ] Training complete train_loss=1.0527617931365967\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T16:46:39.165706Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T16:46:52.691071Z [\u001B[32minfo ] Evaluation complete test_loss=0.9129394888877869\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T16:46:52.769545Z [\u001B[32minfo ] Start Predict dataset=48988\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:03:22.458586Z [\u001B[32minfo ] Starting training dataset=1112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:03:59.079071Z [\u001B[32minfo ] Training complete train_loss=0.9513149857521057\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:03:59.085982Z [\u001B[32minfo ] Starting training dataset=1112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:04:35.268902Z [\u001B[32minfo ] Training complete train_loss=1.043251633644104\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:04:35.276104Z [\u001B[32minfo ] Starting training dataset=1112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:05:11.276178Z [\u001B[32minfo ] Training complete train_loss=0.911968469619751\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:05:11.283077Z [\u001B[32minfo ] Starting training dataset=1112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:05:47.967951Z [\u001B[32minfo ] Training complete train_loss=0.9133453369140625\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:05:47.975760Z [\u001B[32minfo ] Starting training dataset=1112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:06:23.978351Z [\u001B[32minfo ] Training complete train_loss=0.9577516913414001\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T17:06:23.983301Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T17:06:36.879626Z [\u001B[32minfo ] Evaluation complete test_loss=0.8657751083374023\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T17:06:36.886557Z [\u001B[32minfo ] Start Predict dataset=48888\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:23:02.721417Z [\u001B[32minfo ] Starting training dataset=1212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:23:39.677137Z [\u001B[32minfo ] Training complete train_loss=1.0441893339157104\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:23:39.685051Z [\u001B[32minfo ] Starting training dataset=1212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:24:16.871166Z [\u001B[32minfo ] Training complete train_loss=0.929194986820221\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:24:16.881245Z [\u001B[32minfo ] Starting training dataset=1212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:24:54.368581Z [\u001B[32minfo ] Training complete train_loss=1.0284929275512695\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:24:54.375269Z [\u001B[32minfo ] Starting training dataset=1212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:25:31.480339Z [\u001B[32minfo ] Training complete train_loss=0.9321802854537964\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:25:31.487068Z [\u001B[32minfo ] Starting training dataset=1212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:26:09.074131Z [\u001B[32minfo ] Training complete train_loss=0.973421573638916\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T17:26:09.080513Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T17:26:22.782218Z [\u001B[32minfo ] Evaluation complete test_loss=0.8525174260139465\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T17:26:22.793393Z [\u001B[32minfo ] Start Predict dataset=48788\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:42:46.221663Z [\u001B[32minfo ] Starting training dataset=1312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:43:24.664875Z [\u001B[32minfo ] Training complete train_loss=0.9666503071784973\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:43:24.672229Z [\u001B[32minfo ] Starting training dataset=1312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:44:03.190809Z [\u001B[32minfo ] Training complete train_loss=1.002950668334961\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:44:03.269028Z [\u001B[32minfo ] Starting training dataset=1312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:44:41.075336Z [\u001B[32minfo ] Training complete train_loss=0.9888299107551575\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:44:41.082244Z [\u001B[32minfo ] Starting training dataset=1312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:45:19.077166Z [\u001B[32minfo ] Training complete train_loss=0.97415691614151\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T17:45:19.084448Z [\u001B[32minfo ] Starting training dataset=1312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T17:45:57.374593Z [\u001B[32minfo ] Training complete train_loss=1.0200046300888062\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T17:45:57.380081Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T17:46:10.685676Z [\u001B[32minfo ] Evaluation complete test_loss=0.8271186947822571\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T17:46:10.693032Z [\u001B[32minfo ] Start Predict dataset=48688\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:02:32.635236Z [\u001B[32minfo ] Starting training dataset=1412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:03:12.272650Z [\u001B[32minfo ] Training complete train_loss=1.075010895729065\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:03:12.280246Z [\u001B[32minfo ] Starting training dataset=1412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:03:52.190764Z [\u001B[32minfo ] Training complete train_loss=0.992854654788971\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:03:52.198475Z [\u001B[32minfo ] Starting training dataset=1412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:04:32.369968Z [\u001B[32minfo ] Training complete train_loss=1.0081067085266113\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:04:32.376590Z [\u001B[32minfo ] Starting training dataset=1412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:05:12.665417Z [\u001B[32minfo ] Training complete train_loss=0.9663589596748352\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:05:12.672271Z [\u001B[32minfo ] Starting training dataset=1412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:05:52.280446Z [\u001B[32minfo ] Training complete train_loss=1.04006028175354\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T18:05:52.285840Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T18:06:05.572855Z [\u001B[32minfo ] Evaluation complete test_loss=0.804664671421051\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T18:06:05.580908Z [\u001B[32minfo ] Start Predict dataset=48588\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:22:25.401360Z [\u001B[32minfo ] Starting training dataset=1512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:23:06.483224Z [\u001B[32minfo ] Training complete train_loss=0.9277119040489197\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:23:06.491979Z [\u001B[32minfo ] Starting training dataset=1512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:23:48.188278Z [\u001B[32minfo ] Training complete train_loss=1.0428746938705444\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:23:48.267931Z [\u001B[32minfo ] Starting training dataset=1512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:24:30.874456Z [\u001B[32minfo ] Training complete train_loss=0.9510531425476074\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:24:30.881489Z [\u001B[32minfo ] Starting training dataset=1512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:25:12.688178Z [\u001B[32minfo ] Training complete train_loss=0.9416515827178955\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:25:12.766617Z [\u001B[32minfo ] Starting training dataset=1512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:25:53.775859Z [\u001B[32minfo ] Training complete train_loss=0.9402214884757996\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T18:25:53.780544Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T18:26:06.993932Z [\u001B[32minfo ] Evaluation complete test_loss=0.7674832940101624\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T18:26:07.068616Z [\u001B[32minfo ] Start Predict dataset=48488\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:42:29.514702Z [\u001B[32minfo ] Starting training dataset=1612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:43:12.192095Z [\u001B[32minfo ] Training complete train_loss=1.028326153755188\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:43:12.266413Z [\u001B[32minfo ] Starting training dataset=1612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:43:53.779097Z [\u001B[32minfo ] Training complete train_loss=0.9863273501396179\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:43:53.786243Z [\u001B[32minfo ] Starting training dataset=1612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:44:36.290856Z [\u001B[32minfo ] Training complete train_loss=0.9068755507469177\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:44:36.366920Z [\u001B[32minfo ] Starting training dataset=1612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:45:18.870138Z [\u001B[32minfo ] Training complete train_loss=1.023092269897461\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T18:45:18.877238Z [\u001B[32minfo ] Starting training dataset=1612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T18:46:00.383976Z [\u001B[32minfo ] Training complete train_loss=0.965911328792572\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T18:46:00.388895Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T18:46:13.684169Z [\u001B[32minfo ] Evaluation complete test_loss=0.7569476962089539\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T18:46:13.691282Z [\u001B[32minfo ] Start Predict dataset=48388\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:02:30.984315Z [\u001B[32minfo ] Starting training dataset=1712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:03:15.674997Z [\u001B[32minfo ] Training complete train_loss=0.9350682497024536\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:03:15.681606Z [\u001B[32minfo ] Starting training dataset=1712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:03:58.887455Z [\u001B[32minfo ] Training complete train_loss=0.9201800227165222\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:03:58.894480Z [\u001B[32minfo ] Starting training dataset=1712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:04:41.578060Z [\u001B[32minfo ] Training complete train_loss=0.905670166015625\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:04:41.585305Z [\u001B[32minfo ] Starting training dataset=1712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:05:24.475178Z [\u001B[32minfo ] Training complete train_loss=0.9620814323425293\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:05:24.482448Z [\u001B[32minfo ] Starting training dataset=1712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:06:07.490241Z [\u001B[32minfo ] Training complete train_loss=0.9224438071250916\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T19:06:07.570348Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T19:06:20.696662Z [\u001B[32minfo ] Evaluation complete test_loss=0.7361646890640259\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T19:06:20.771211Z [\u001B[32minfo ] Start Predict dataset=48288\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:22:41.163738Z [\u001B[32minfo ] Starting training dataset=1812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:23:25.570368Z [\u001B[32minfo ] Training complete train_loss=0.9820511937141418\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:23:25.579157Z [\u001B[32minfo ] Starting training dataset=1812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:24:10.075295Z [\u001B[32minfo ] Training complete train_loss=0.9375017881393433\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:24:10.083318Z [\u001B[32minfo ] Starting training dataset=1812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:24:54.880439Z [\u001B[32minfo ] Training complete train_loss=0.9257637858390808\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:24:54.888091Z [\u001B[32minfo ] Starting training dataset=1812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:25:39.078416Z [\u001B[32minfo ] Training complete train_loss=0.9971398711204529\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:25:39.086671Z [\u001B[32minfo ] Starting training dataset=1812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:26:23.472168Z [\u001B[32minfo ] Training complete train_loss=1.013749599456787\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T19:26:23.476976Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T19:26:36.574364Z [\u001B[32minfo ] Evaluation complete test_loss=0.756501317024231\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T19:26:36.581478Z [\u001B[32minfo ] Start Predict dataset=48188\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:43:22.766592Z [\u001B[32minfo ] Starting training dataset=1912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:44:11.265697Z [\u001B[32minfo ] Training complete train_loss=0.9557903409004211\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:44:11.274063Z [\u001B[32minfo ] Starting training dataset=1912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:44:59.982236Z [\u001B[32minfo ] Training complete train_loss=0.9173471927642822\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:44:59.989619Z [\u001B[32minfo ] Starting training dataset=1912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:45:48.168328Z [\u001B[32minfo ] Training complete train_loss=1.0321499109268188\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:45:48.176667Z [\u001B[32minfo ] Starting training dataset=1912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:46:35.773009Z [\u001B[32minfo ] Training complete train_loss=0.9191875457763672\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T19:46:35.781286Z [\u001B[32minfo ] Starting training dataset=1912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T19:47:22.576191Z [\u001B[32minfo ] Training complete train_loss=0.9246641397476196\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T19:47:22.581554Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T19:47:36.582237Z [\u001B[32minfo ] Evaluation complete test_loss=0.7167291045188904\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T19:47:36.589409Z [\u001B[32minfo ] Start Predict dataset=48088\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:04:31.552071Z [\u001B[32minfo ] Starting training dataset=2012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:05:20.382208Z [\u001B[32minfo ] Training complete train_loss=0.9722501039505005\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:05:20.390139Z [\u001B[32minfo ] Starting training dataset=2012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:06:09.175924Z [\u001B[32minfo ] Training complete train_loss=0.9446245431900024\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:06:09.183759Z [\u001B[32minfo ] Starting training dataset=2012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:06:58.075425Z [\u001B[32minfo ] Training complete train_loss=0.9757698774337769\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:06:58.083521Z [\u001B[32minfo ] Starting training dataset=2012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:07:46.880669Z [\u001B[32minfo ] Training complete train_loss=0.9409571886062622\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:07:46.888642Z [\u001B[32minfo ] Starting training dataset=2012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:08:34.876303Z [\u001B[32minfo ] Training complete train_loss=0.9852419495582581\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T20:08:34.881729Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T20:08:48.985392Z [\u001B[32minfo ] Evaluation complete test_loss=0.723760724067688\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T20:08:48.992518Z [\u001B[32minfo ] Start Predict dataset=47988\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:25:34.747891Z [\u001B[32minfo ] Starting training dataset=2112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:26:23.872387Z [\u001B[32minfo ] Training complete train_loss=0.9171951413154602\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:26:23.880777Z [\u001B[32minfo ] Starting training dataset=2112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:27:13.165485Z [\u001B[32minfo ] Training complete train_loss=0.9799726009368896\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:27:13.173679Z [\u001B[32minfo ] Starting training dataset=2112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:28:03.283328Z [\u001B[32minfo ] Training complete train_loss=0.9386671185493469\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:28:03.290982Z [\u001B[32minfo ] Starting training dataset=2112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:28:52.777442Z [\u001B[32minfo ] Training complete train_loss=0.978201687335968\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:28:52.784738Z [\u001B[32minfo ] Starting training dataset=2112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:29:42.877656Z [\u001B[32minfo ] Training complete train_loss=0.9618527293205261\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T20:29:42.883872Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T20:29:57.578619Z [\u001B[32minfo ] Evaluation complete test_loss=0.7093442678451538\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T20:29:57.585998Z [\u001B[32minfo ] Start Predict dataset=47888\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:46:40.178333Z [\u001B[32minfo ] Starting training dataset=2212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:47:31.289002Z [\u001B[32minfo ] Training complete train_loss=0.9138000011444092\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:47:31.296781Z [\u001B[32minfo ] Starting training dataset=2212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:48:21.980726Z [\u001B[32minfo ] Training complete train_loss=1.0114140510559082\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:48:21.987649Z [\u001B[32minfo ] Starting training dataset=2212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:49:12.391125Z [\u001B[32minfo ] Training complete train_loss=1.171674370765686\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:49:12.467107Z [\u001B[32minfo ] Starting training dataset=2212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:50:02.169114Z [\u001B[32minfo ] Training complete train_loss=0.9495107531547546\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T20:50:02.177071Z [\u001B[32minfo ] Starting training dataset=2212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T20:50:52.579582Z [\u001B[32minfo ] Training complete train_loss=1.0180065631866455\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T20:50:52.585367Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T20:51:06.175383Z [\u001B[32minfo ] Evaluation complete test_loss=0.7294108271598816\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T20:51:06.183712Z [\u001B[32minfo ] Start Predict dataset=47788\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:07:22.539634Z [\u001B[32minfo ] Starting training dataset=2312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:08:16.280207Z [\u001B[32minfo ] Training complete train_loss=0.9690930247306824\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:08:16.286669Z [\u001B[32minfo ] Starting training dataset=2312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:09:09.184223Z [\u001B[32minfo ] Training complete train_loss=0.9622244834899902\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:09:09.191550Z [\u001B[32minfo ] Starting training dataset=2312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:10:02.666252Z [\u001B[32minfo ] Training complete train_loss=0.9211553335189819\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:10:02.674865Z [\u001B[32minfo ] Starting training dataset=2312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:10:57.070214Z [\u001B[32minfo ] Training complete train_loss=0.9597650766372681\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:10:57.077798Z [\u001B[32minfo ] Starting training dataset=2312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:11:50.182542Z [\u001B[32minfo ] Training complete train_loss=0.964903712272644\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T21:11:50.188292Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T21:12:03.866348Z [\u001B[32minfo ] Evaluation complete test_loss=0.6798441410064697\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T21:12:03.874067Z [\u001B[32minfo ] Start Predict dataset=47688\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:28:11.883129Z [\u001B[32minfo ] Starting training dataset=2412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:29:03.879859Z [\u001B[32minfo ] Training complete train_loss=1.0068079233169556\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:29:03.887980Z [\u001B[32minfo ] Starting training dataset=2412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:29:56.381578Z [\u001B[32minfo ] Training complete train_loss=0.9520434141159058\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:29:56.390312Z [\u001B[32minfo ] Starting training dataset=2412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:30:48.191878Z [\u001B[32minfo ] Training complete train_loss=0.9582843780517578\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:30:48.266421Z [\u001B[32minfo ] Starting training dataset=2412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:31:40.281782Z [\u001B[32minfo ] Training complete train_loss=1.0118552446365356\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:31:40.288720Z [\u001B[32minfo ] Starting training dataset=2412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:32:32.970914Z [\u001B[32minfo ] Training complete train_loss=0.9314520955085754\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T21:32:32.976789Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T21:32:46.474366Z [\u001B[32minfo ] Evaluation complete test_loss=0.6736973524093628\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T21:32:46.480994Z [\u001B[32minfo ] Start Predict dataset=47588\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:48:57.148416Z [\u001B[32minfo ] Starting training dataset=2512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:49:50.280161Z [\u001B[32minfo ] Training complete train_loss=0.9337089657783508\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:49:50.286868Z [\u001B[32minfo ] Starting training dataset=2512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:50:43.675634Z [\u001B[32minfo ] Training complete train_loss=1.005215048789978\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:50:43.686572Z [\u001B[32minfo ] Starting training dataset=2512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:51:37.178821Z [\u001B[32minfo ] Training complete train_loss=0.9765651226043701\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:51:37.185438Z [\u001B[32minfo ] Starting training dataset=2512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:52:30.772492Z [\u001B[32minfo ] Training complete train_loss=1.0099214315414429\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T21:52:30.780204Z [\u001B[32minfo ] Starting training dataset=2512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T21:53:25.086434Z [\u001B[32minfo ] Training complete train_loss=1.1066418886184692\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T21:53:25.092135Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T21:53:38.890457Z [\u001B[32minfo ] Evaluation complete test_loss=0.6832886934280396\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T21:53:38.968482Z [\u001B[32minfo ] Start Predict dataset=47488\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:09:47.037387Z [\u001B[32minfo ] Starting training dataset=2612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:10:41.882671Z [\u001B[32minfo ] Training complete train_loss=0.9732304811477661\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:10:41.889490Z [\u001B[32minfo ] Starting training dataset=2612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:11:36.080223Z [\u001B[32minfo ] Training complete train_loss=0.9936309456825256\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:11:36.087166Z [\u001B[32minfo ] Starting training dataset=2612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:12:30.385642Z [\u001B[32minfo ] Training complete train_loss=0.8979130983352661\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:12:30.392645Z [\u001B[32minfo ] Starting training dataset=2612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:13:24.886768Z [\u001B[32minfo ] Training complete train_loss=1.0363106727600098\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:13:24.894615Z [\u001B[32minfo ] Starting training dataset=2612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:14:19.284788Z [\u001B[32minfo ] Training complete train_loss=0.9428572058677673\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T22:14:19.289613Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T22:14:32.774810Z [\u001B[32minfo ] Evaluation complete test_loss=0.6570330858230591\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T22:14:32.781457Z [\u001B[32minfo ] Start Predict dataset=47388\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:30:41.520167Z [\u001B[32minfo ] Starting training dataset=2712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:31:38.892835Z [\u001B[32minfo ] Training complete train_loss=0.922064483165741\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:31:38.971116Z [\u001B[32minfo ] Starting training dataset=2712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:32:34.989046Z [\u001B[32minfo ] Training complete train_loss=0.8859795331954956\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:32:35.067765Z [\u001B[32minfo ] Starting training dataset=2712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:33:32.582960Z [\u001B[32minfo ] Training complete train_loss=0.9763253331184387\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:33:32.590187Z [\u001B[32minfo ] Starting training dataset=2712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:34:29.085416Z [\u001B[32minfo ] Training complete train_loss=0.9625152349472046\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:34:29.091940Z [\u001B[32minfo ] Starting training dataset=2712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:35:25.181361Z [\u001B[32minfo ] Training complete train_loss=0.9341091513633728\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T22:35:25.186428Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T22:35:39.483789Z [\u001B[32minfo ] Evaluation complete test_loss=0.6503035426139832\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T22:35:39.491229Z [\u001B[32minfo ] Start Predict dataset=47288\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:51:37.149598Z [\u001B[32minfo ] Starting training dataset=2812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:52:33.470586Z [\u001B[32minfo ] Training complete train_loss=0.9524716734886169\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:52:33.477044Z [\u001B[32minfo ] Starting training dataset=2812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:53:29.469538Z [\u001B[32minfo ] Training complete train_loss=0.9347421526908875\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:53:29.476470Z [\u001B[32minfo ] Starting training dataset=2812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:54:25.470438Z [\u001B[32minfo ] Training complete train_loss=0.96184903383255\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:54:25.477112Z [\u001B[32minfo ] Starting training dataset=2812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:55:21.368596Z [\u001B[32minfo ] Training complete train_loss=0.9115388989448547\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T22:55:21.375550Z [\u001B[32minfo ] Starting training dataset=2812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T22:56:17.476897Z [\u001B[32minfo ] Training complete train_loss=0.9248518347740173\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T22:56:17.481429Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T22:56:30.870313Z [\u001B[32minfo ] Evaluation complete test_loss=0.6633090972900391\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T22:56:30.876905Z [\u001B[32minfo ] Start Predict dataset=47188\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:12:24.288451Z [\u001B[32minfo ] Starting training dataset=2912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:13:21.378543Z [\u001B[32minfo ] Training complete train_loss=0.9456859230995178\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:13:21.385101Z [\u001B[32minfo ] Starting training dataset=2912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:14:18.189772Z [\u001B[32minfo ] Training complete train_loss=0.9629805684089661\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:14:18.267688Z [\u001B[32minfo ] Starting training dataset=2912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:15:15.173773Z [\u001B[32minfo ] Training complete train_loss=0.9584450125694275\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:15:15.180834Z [\u001B[32minfo ] Starting training dataset=2912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:16:12.666025Z [\u001B[32minfo ] Training complete train_loss=0.9231571555137634\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:16:12.672462Z [\u001B[32minfo ] Starting training dataset=2912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:17:10.073503Z [\u001B[32minfo ] Training complete train_loss=0.9232361912727356\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T23:17:10.078705Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T23:17:23.270843Z [\u001B[32minfo ] Evaluation complete test_loss=0.6433601975440979\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T23:17:23.277508Z [\u001B[32minfo ] Start Predict dataset=47088\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:33:28.882364Z [\u001B[32minfo ] Starting training dataset=3012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:34:28.174089Z [\u001B[32minfo ] Training complete train_loss=0.9940481781959534\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:34:28.181205Z [\u001B[32minfo ] Starting training dataset=3012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:35:27.384836Z [\u001B[32minfo ] Training complete train_loss=0.9763245582580566\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:35:27.392383Z [\u001B[32minfo ] Starting training dataset=3012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:36:27.078643Z [\u001B[32minfo ] Training complete train_loss=0.9977985620498657\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:36:27.085664Z [\u001B[32minfo ] Starting training dataset=3012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:37:26.387570Z [\u001B[32minfo ] Training complete train_loss=0.9883565902709961\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:37:26.394578Z [\u001B[32minfo ] Starting training dataset=3012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:38:26.078454Z [\u001B[32minfo ] Training complete train_loss=1.0123181343078613\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T23:38:26.084095Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T23:38:39.683168Z [\u001B[32minfo ] Evaluation complete test_loss=0.6295325756072998\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T23:38:39.690456Z [\u001B[32minfo ] Start Predict dataset=46988\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:54:29.465550Z [\u001B[32minfo ] Starting training dataset=3112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:55:29.272588Z [\u001B[32minfo ] Training complete train_loss=0.993098795413971\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:55:29.279219Z [\u001B[32minfo ] Starting training dataset=3112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:56:29.374525Z [\u001B[32minfo ] Training complete train_loss=0.982636570930481\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:56:29.381500Z [\u001B[32minfo ] Starting training dataset=3112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:57:28.969284Z [\u001B[32minfo ] Training complete train_loss=0.9553168416023254\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:57:28.978303Z [\u001B[32minfo ] Starting training dataset=3112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:58:28.386104Z [\u001B[32minfo ] Training complete train_loss=0.9859296679496765\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T23:58:28.392510Z [\u001B[32minfo ] Starting training dataset=3112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T23:59:28.085509Z [\u001B[32minfo ] Training complete train_loss=0.911634624004364\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T23:59:28.090740Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T23:59:41.378569Z [\u001B[32minfo ] Evaluation complete test_loss=0.6155526041984558\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T23:59:41.388091Z [\u001B[32minfo ] Start Predict dataset=46888\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:15:27.000611Z [\u001B[32minfo ] Starting training dataset=3212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:16:27.485323Z [\u001B[32minfo ] Training complete train_loss=0.9690141081809998\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:16:27.492326Z [\u001B[32minfo ] Starting training dataset=3212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:17:27.971410Z [\u001B[32minfo ] Training complete train_loss=0.9362989664077759\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:17:27.978854Z [\u001B[32minfo ] Starting training dataset=3212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:18:28.374476Z [\u001B[32minfo ] Training complete train_loss=0.9801816940307617\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:18:28.381476Z [\u001B[32minfo ] Starting training dataset=3212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:19:28.688675Z [\u001B[32minfo ] Training complete train_loss=0.982334315776825\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:19:28.695882Z [\u001B[32minfo ] Starting training dataset=3212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:20:29.280950Z [\u001B[32minfo ] Training complete train_loss=1.0058739185333252\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T00:20:29.285715Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T00:20:42.576387Z [\u001B[32minfo ] Evaluation complete test_loss=0.624253511428833\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T00:20:42.582922Z [\u001B[32minfo ] Start Predict dataset=46788\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:36:24.397328Z [\u001B[32minfo ] Starting training dataset=3312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:37:25.777847Z [\u001B[32minfo ] Training complete train_loss=0.9525146484375\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:37:25.785148Z [\u001B[32minfo ] Starting training dataset=3312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:38:27.670850Z [\u001B[32minfo ] Training complete train_loss=0.9617348909378052\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:38:27.678614Z [\u001B[32minfo ] Starting training dataset=3312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:39:29.474158Z [\u001B[32minfo ] Training complete train_loss=0.9088602066040039\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:39:29.480853Z [\u001B[32minfo ] Starting training dataset=3312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:40:31.282180Z [\u001B[32minfo ] Training complete train_loss=0.941899836063385\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:40:31.288671Z [\u001B[32minfo ] Starting training dataset=3312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:41:32.674235Z [\u001B[32minfo ] Training complete train_loss=0.9036476612091064\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T00:41:32.679012Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T00:41:45.880151Z [\u001B[32minfo ] Evaluation complete test_loss=0.6003412008285522\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T00:41:45.886711Z [\u001B[32minfo ] Start Predict dataset=46688\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:57:25.233096Z [\u001B[32minfo ] Starting training dataset=3412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:58:27.771124Z [\u001B[32minfo ] Training complete train_loss=0.9652302861213684\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:58:27.778539Z [\u001B[32minfo ] Starting training dataset=3412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T00:59:29.682729Z [\u001B[32minfo ] Training complete train_loss=0.9133856892585754\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T00:59:29.689246Z [\u001B[32minfo ] Starting training dataset=3412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:00:32.276974Z [\u001B[32minfo ] Training complete train_loss=0.9120355248451233\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:00:32.285180Z [\u001B[32minfo ] Starting training dataset=3412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:01:34.571251Z [\u001B[32minfo ] Training complete train_loss=0.9987239241600037\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:01:34.579233Z [\u001B[32minfo ] Starting training dataset=3412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:02:36.971180Z [\u001B[32minfo ] Training complete train_loss=1.0511140823364258\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T01:02:36.976140Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T01:02:50.389403Z [\u001B[32minfo ] Evaluation complete test_loss=0.600139319896698\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T01:02:50.466974Z [\u001B[32minfo ] Start Predict dataset=46588\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:18:50.039255Z [\u001B[32minfo ] Starting training dataset=3512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:19:55.272920Z [\u001B[32minfo ] Training complete train_loss=0.9255532026290894\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:19:55.280740Z [\u001B[32minfo ] Starting training dataset=3512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:21:00.783239Z [\u001B[32minfo ] Training complete train_loss=0.9507350325584412\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:21:00.791110Z [\u001B[32minfo ] Starting training dataset=3512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:22:06.082850Z [\u001B[32minfo ] Training complete train_loss=0.9777060151100159\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:22:06.090186Z [\u001B[32minfo ] Starting training dataset=3512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:23:11.572780Z [\u001B[32minfo ] Training complete train_loss=0.9614008069038391\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:23:11.581010Z [\u001B[32minfo ] Starting training dataset=3512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:24:17.388319Z [\u001B[32minfo ] Training complete train_loss=0.9385112524032593\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T01:24:17.464897Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T01:24:31.274227Z [\u001B[32minfo ] Evaluation complete test_loss=0.5854552388191223\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T01:24:31.281479Z [\u001B[32minfo ] Start Predict dataset=46488\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:40:39.084449Z [\u001B[32minfo ] Starting training dataset=3612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:41:51.372209Z [\u001B[32minfo ] Training complete train_loss=0.9902898669242859\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:41:51.384139Z [\u001B[32minfo ] Starting training dataset=3612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:43:04.776776Z [\u001B[32minfo ] Training complete train_loss=0.9506107568740845\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:43:04.784669Z [\u001B[32minfo ] Starting training dataset=3612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:44:16.971692Z [\u001B[32minfo ] Training complete train_loss=0.9101556539535522\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:44:16.980149Z [\u001B[32minfo ] Starting training dataset=3612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:45:30.171964Z [\u001B[32minfo ] Training complete train_loss=0.9209990501403809\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T01:45:30.180936Z [\u001B[32minfo ] Starting training dataset=3612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T01:46:41.988783Z [\u001B[32minfo ] Training complete train_loss=0.9617601633071899\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T01:46:41.995001Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T01:46:56.588932Z [\u001B[32minfo ] Evaluation complete test_loss=0.588925838470459\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T01:46:56.596366Z [\u001B[32minfo ] Start Predict dataset=46388\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:03:02.545742Z [\u001B[32minfo ] Starting training dataset=3712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:04:10.991458Z [\u001B[32minfo ] Training complete train_loss=0.902265727519989\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:04:11.070627Z [\u001B[32minfo ] Starting training dataset=3712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:05:20.275718Z [\u001B[32minfo ] Training complete train_loss=0.9274501204490662\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:05:20.283525Z [\u001B[32minfo ] Starting training dataset=3712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:06:30.384887Z [\u001B[32minfo ] Training complete train_loss=0.948907732963562\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:06:30.392616Z [\u001B[32minfo ] Starting training dataset=3712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:07:40.073581Z [\u001B[32minfo ] Training complete train_loss=0.9361139535903931\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:07:40.081202Z [\u001B[32minfo ] Starting training dataset=3712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:08:49.870122Z [\u001B[32minfo ] Training complete train_loss=0.9426257014274597\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T02:08:49.876067Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T02:09:03.779674Z [\u001B[32minfo ] Evaluation complete test_loss=0.5798165798187256\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T02:09:03.787281Z [\u001B[32minfo ] Start Predict dataset=46288\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:25:00.390871Z [\u001B[32minfo ] Starting training dataset=3812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:26:08.266439Z [\u001B[32minfo ] Training complete train_loss=0.9778515100479126\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:26:08.273235Z [\u001B[32minfo ] Starting training dataset=3812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:27:16.871698Z [\u001B[32minfo ] Training complete train_loss=0.9607134461402893\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:27:16.879594Z [\u001B[32minfo ] Starting training dataset=3812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:28:25.190132Z [\u001B[32minfo ] Training complete train_loss=1.0153460502624512\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:28:25.268855Z [\u001B[32minfo ] Starting training dataset=3812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:29:34.683644Z [\u001B[32minfo ] Training complete train_loss=1.0046292543411255\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:29:34.692586Z [\u001B[32minfo ] Starting training dataset=3812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:30:44.576145Z [\u001B[32minfo ] Training complete train_loss=0.9653735756874084\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T02:30:44.582048Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T02:30:58.377427Z [\u001B[32minfo ] Evaluation complete test_loss=0.5952193140983582\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T02:30:58.384668Z [\u001B[32minfo ] Start Predict dataset=46188\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:46:37.602564Z [\u001B[32minfo ] Starting training dataset=3912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:47:48.374654Z [\u001B[32minfo ] Training complete train_loss=0.9733763933181763\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:47:48.381969Z [\u001B[32minfo ] Starting training dataset=3912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:48:58.977107Z [\u001B[32minfo ] Training complete train_loss=0.9541704058647156\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:48:58.984005Z [\u001B[32minfo ] Starting training dataset=3912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:50:07.770611Z [\u001B[32minfo ] Training complete train_loss=0.8996275067329407\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:50:07.777522Z [\u001B[32minfo ] Starting training dataset=3912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:51:17.876229Z [\u001B[32minfo ] Training complete train_loss=0.9932214617729187\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T02:51:17.883210Z [\u001B[32minfo ] Starting training dataset=3912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T02:52:26.082136Z [\u001B[32minfo ] Training complete train_loss=1.0322301387786865\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T02:52:26.087786Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T02:52:39.383655Z [\u001B[32minfo ] Evaluation complete test_loss=0.5735118985176086\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T02:52:39.391237Z [\u001B[32minfo ] Start Predict dataset=46088\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:08:19.372014Z [\u001B[32minfo ] Starting training dataset=4012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:09:30.573365Z [\u001B[32minfo ] Training complete train_loss=0.8765227794647217\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:09:30.580674Z [\u001B[32minfo ] Starting training dataset=4012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:10:40.077172Z [\u001B[32minfo ] Training complete train_loss=0.9590541124343872\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:10:40.084249Z [\u001B[32minfo ] Starting training dataset=4012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:11:51.185509Z [\u001B[32minfo ] Training complete train_loss=0.9163060784339905\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:11:51.192620Z [\u001B[32minfo ] Starting training dataset=4012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:13:01.870247Z [\u001B[32minfo ] Training complete train_loss=0.9452018737792969\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:13:01.878052Z [\u001B[32minfo ] Starting training dataset=4012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:14:12.270683Z [\u001B[32minfo ] Training complete train_loss=0.9404565095901489\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T03:14:12.276100Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T03:14:25.490276Z [\u001B[32minfo ] Evaluation complete test_loss=0.562659740447998\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T03:14:25.569134Z [\u001B[32minfo ] Start Predict dataset=45988\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:30:03.646820Z [\u001B[32minfo ] Starting training dataset=4112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:31:13.466228Z [\u001B[32minfo ] Training complete train_loss=1.0031845569610596\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:31:13.472767Z [\u001B[32minfo ] Starting training dataset=4112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:32:23.173042Z [\u001B[32minfo ] Training complete train_loss=0.9533597230911255\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:32:23.179884Z [\u001B[32minfo ] Starting training dataset=4112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:33:33.078510Z [\u001B[32minfo ] Training complete train_loss=0.9691417217254639\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:33:33.084899Z [\u001B[32minfo ] Starting training dataset=4112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:34:42.771816Z [\u001B[32minfo ] Training complete train_loss=0.9820665717124939\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:34:42.778569Z [\u001B[32minfo ] Starting training dataset=4112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:35:52.866703Z [\u001B[32minfo ] Training complete train_loss=0.9796773195266724\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T03:35:52.871526Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T03:36:05.879288Z [\u001B[32minfo ] Evaluation complete test_loss=0.5647993087768555\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T03:36:05.885746Z [\u001B[32minfo ] Start Predict dataset=45888\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:51:29.489449Z [\u001B[32minfo ] Starting training dataset=4212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:52:40.770569Z [\u001B[32minfo ] Training complete train_loss=0.936839759349823\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:52:40.778765Z [\u001B[32minfo ] Starting training dataset=4212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:55:02.783307Z [\u001B[32minfo ] Training complete train_loss=0.9478463530540466\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:55:02.789775Z [\u001B[32minfo ] Starting training dataset=4212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:56:14.090717Z [\u001B[32minfo ] Training complete train_loss=0.9612425565719604\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T03:56:14.168573Z [\u001B[32minfo ] Starting training dataset=4212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T03:57:24.880913Z [\u001B[32minfo ] Training complete train_loss=0.9398402571678162\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T03:57:24.885770Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T03:57:38.175341Z [\u001B[32minfo ] Evaluation complete test_loss=0.5565947890281677\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T03:57:38.182963Z [\u001B[32minfo ] Start Predict dataset=45788\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:12:59.505991Z [\u001B[32minfo ] Starting training dataset=4312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:14:11.385840Z [\u001B[32minfo ] Training complete train_loss=1.0070544481277466\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:14:11.393299Z [\u001B[32minfo ] Starting training dataset=4312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:15:23.470047Z [\u001B[32minfo ] Training complete train_loss=0.9844921231269836\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:15:23.476889Z [\u001B[32minfo ] Starting training dataset=4312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:16:35.675671Z [\u001B[32minfo ] Training complete train_loss=0.9934103488922119\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:16:35.682764Z [\u001B[32minfo ] Starting training dataset=4312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:17:50.472318Z [\u001B[32minfo ] Training complete train_loss=0.9531852602958679\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:17:50.479302Z [\u001B[32minfo ] Starting training dataset=4312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:19:05.178550Z [\u001B[32minfo ] Training complete train_loss=0.9936967492103577\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T04:19:05.184399Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T04:19:18.580249Z [\u001B[32minfo ] Evaluation complete test_loss=0.5665599703788757\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T04:19:18.586943Z [\u001B[32minfo ] Start Predict dataset=45688\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:34:51.766049Z [\u001B[32minfo ] Starting training dataset=4412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:36:06.384482Z [\u001B[32minfo ] Training complete train_loss=0.9465316534042358\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:36:06.391749Z [\u001B[32minfo ] Starting training dataset=4412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:37:21.270006Z [\u001B[32minfo ] Training complete train_loss=0.9687955379486084\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:37:21.277308Z [\u001B[32minfo ] Starting training dataset=4412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:38:34.181247Z [\u001B[32minfo ] Training complete train_loss=0.9473304152488708\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:38:34.188378Z [\u001B[32minfo ] Starting training dataset=4412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:39:48.682254Z [\u001B[32minfo ] Training complete train_loss=0.9860917329788208\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:39:48.688619Z [\u001B[32minfo ] Starting training dataset=4412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:41:01.381298Z [\u001B[32minfo ] Training complete train_loss=0.9456995129585266\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T04:41:01.386213Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T04:41:14.570893Z [\u001B[32minfo ] Evaluation complete test_loss=0.5559056997299194\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T04:41:14.578396Z [\u001B[32minfo ] Start Predict dataset=45588\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:56:32.245143Z [\u001B[32minfo ] Starting training dataset=4512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:57:46.188021Z [\u001B[32minfo ] Training complete train_loss=0.9349108934402466\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:57:46.195296Z [\u001B[32minfo ] Starting training dataset=4512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T04:59:00.277858Z [\u001B[32minfo ] Training complete train_loss=0.974810779094696\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T04:59:00.284605Z [\u001B[32minfo ] Starting training dataset=4512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:00:14.582182Z [\u001B[32minfo ] Training complete train_loss=0.9310106635093689\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:00:14.589957Z [\u001B[32minfo ] Starting training dataset=4512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:01:29.273130Z [\u001B[32minfo ] Training complete train_loss=0.9381789565086365\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:01:29.279718Z [\u001B[32minfo ] Starting training dataset=4512 epoch=10\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:18:11.286794Z [\u001B[32minfo ] Starting training dataset=4612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:19:27.672247Z [\u001B[32minfo ] Training complete train_loss=0.9956920742988586\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:19:27.679533Z [\u001B[32minfo ] Starting training dataset=4612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:20:44.580321Z [\u001B[32minfo ] Training complete train_loss=0.9173332452774048\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:20:44.587566Z [\u001B[32minfo ] Starting training dataset=4612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:22:01.084456Z [\u001B[32minfo ] Training complete train_loss=0.9863068461418152\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:22:01.091486Z [\u001B[32minfo ] Starting training dataset=4612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:23:18.674201Z [\u001B[32minfo ] Training complete train_loss=0.9764572381973267\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:23:18.681107Z [\u001B[32minfo ] Starting training dataset=4612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:24:35.480255Z [\u001B[32minfo ] Training complete train_loss=1.0168014764785767\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T05:24:35.485128Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T05:24:48.874589Z [\u001B[32minfo ] Evaluation complete test_loss=0.5520719289779663\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T05:24:48.881463Z [\u001B[32minfo ] Start Predict dataset=45388\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:40:04.304007Z [\u001B[32minfo ] Starting training dataset=4712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:41:21.974405Z [\u001B[32minfo ] Training complete train_loss=0.9784602522850037\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:41:21.981149Z [\u001B[32minfo ] Starting training dataset=4712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:42:39.381711Z [\u001B[32minfo ] Training complete train_loss=0.9848698973655701\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:42:39.388686Z [\u001B[32minfo ] Starting training dataset=4712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:43:57.482323Z [\u001B[32minfo ] Training complete train_loss=0.9804074168205261\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:43:57.489593Z [\u001B[32minfo ] Starting training dataset=4712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:45:14.773833Z [\u001B[32minfo ] Training complete train_loss=0.9591724276542664\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T05:45:14.781518Z [\u001B[32minfo ] Starting training dataset=4712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T05:46:33.281245Z [\u001B[32minfo ] Training complete train_loss=1.0082279443740845\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T05:46:33.286733Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T05:46:46.591746Z [\u001B[32minfo ] Evaluation complete test_loss=0.5615522861480713\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T05:46:46.665617Z [\u001B[32minfo ] Start Predict dataset=45288\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:01:59.899100Z [\u001B[32minfo ] Starting training dataset=4812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:03:19.880195Z [\u001B[32minfo ] Training complete train_loss=0.9809603691101074\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:03:19.888224Z [\u001B[32minfo ] Starting training dataset=4812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:04:39.375615Z [\u001B[32minfo ] Training complete train_loss=0.973745584487915\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:04:39.383822Z [\u001B[32minfo ] Starting training dataset=4812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:05:58.485925Z [\u001B[32minfo ] Training complete train_loss=0.9743210673332214\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:05:58.493312Z [\u001B[32minfo ] Starting training dataset=4812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:07:17.076932Z [\u001B[32minfo ] Training complete train_loss=0.91501784324646\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:07:17.083588Z [\u001B[32minfo ] Starting training dataset=4812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:08:35.989119Z [\u001B[32minfo ] Training complete train_loss=0.9752476215362549\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T06:08:35.995117Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T06:08:49.377772Z [\u001B[32minfo ] Evaluation complete test_loss=0.5705881118774414\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T06:08:49.385668Z [\u001B[32minfo ] Start Predict dataset=45188\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:23:59.766065Z [\u001B[32minfo ] Starting training dataset=4912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:25:20.285377Z [\u001B[32minfo ] Training complete train_loss=0.9525318741798401\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:25:20.293520Z [\u001B[32minfo ] Starting training dataset=4912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:26:39.891338Z [\u001B[32minfo ] Training complete train_loss=0.9308268427848816\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:26:39.970277Z [\u001B[32minfo ] Starting training dataset=4912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:27:59.877774Z [\u001B[32minfo ] Training complete train_loss=0.9380195140838623\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:27:59.885615Z [\u001B[32minfo ] Starting training dataset=4912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:29:19.772513Z [\u001B[32minfo ] Training complete train_loss=0.9800739288330078\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:29:19.780060Z [\u001B[32minfo ] Starting training dataset=4912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:30:40.389743Z [\u001B[32minfo ] Training complete train_loss=0.9535344243049622\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T06:30:40.395101Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T06:30:53.871705Z [\u001B[32minfo ] Evaluation complete test_loss=0.5377533435821533\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T06:30:53.879359Z [\u001B[32minfo ] Start Predict dataset=45088\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:46:03.495772Z [\u001B[32minfo ] Starting training dataset=5012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:47:24.584999Z [\u001B[32minfo ] Training complete train_loss=0.9156500697135925\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:47:24.591922Z [\u001B[32minfo ] Starting training dataset=5012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:48:45.286743Z [\u001B[32minfo ] Training complete train_loss=0.9553552269935608\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:48:45.294576Z [\u001B[32minfo ] Starting training dataset=5012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:50:05.775591Z [\u001B[32minfo ] Training complete train_loss=0.9296905994415283\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:50:05.783333Z [\u001B[32minfo ] Starting training dataset=5012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:51:26.690279Z [\u001B[32minfo ] Training complete train_loss=0.9139729738235474\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T06:51:26.697377Z [\u001B[32minfo ] Starting training dataset=5012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T06:52:47.585409Z [\u001B[32minfo ] Training complete train_loss=0.9346900582313538\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T06:52:47.591531Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T06:53:00.973940Z [\u001B[32minfo ] Evaluation complete test_loss=0.5232065320014954\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T06:53:00.981762Z [\u001B[32minfo ] Start Predict dataset=44988\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:08:07.516317Z [\u001B[32minfo ] Starting training dataset=5112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:09:28.877012Z [\u001B[32minfo ] Training complete train_loss=0.9632118940353394\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:09:28.884978Z [\u001B[32minfo ] Starting training dataset=5112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:10:50.481289Z [\u001B[32minfo ] Training complete train_loss=0.8998667597770691\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:10:50.489168Z [\u001B[32minfo ] Starting training dataset=5112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:12:12.371390Z [\u001B[32minfo ] Training complete train_loss=0.9703662991523743\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:12:12.378297Z [\u001B[32minfo ] Starting training dataset=5112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:13:34.783263Z [\u001B[32minfo ] Training complete train_loss=0.911615252494812\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:13:34.790932Z [\u001B[32minfo ] Starting training dataset=5112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:14:57.593475Z [\u001B[32minfo ] Training complete train_loss=0.9733651280403137\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T07:14:57.666507Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T07:15:10.972475Z [\u001B[32minfo ] Evaluation complete test_loss=0.5270566344261169\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T07:15:10.979908Z [\u001B[32minfo ] Start Predict dataset=44888\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:30:15.747722Z [\u001B[32minfo ] Starting training dataset=5212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:31:38.673276Z [\u001B[32minfo ] Training complete train_loss=0.8865935802459717\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:31:38.680685Z [\u001B[32minfo ] Starting training dataset=5212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:33:01.991828Z [\u001B[32minfo ] Training complete train_loss=0.9334018230438232\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:33:02.072821Z [\u001B[32minfo ] Starting training dataset=5212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:34:24.690463Z [\u001B[32minfo ] Training complete train_loss=0.9533438682556152\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:34:24.765045Z [\u001B[32minfo ] Starting training dataset=5212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:35:48.084328Z [\u001B[32minfo ] Training complete train_loss=0.9691643118858337\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:35:48.091523Z [\u001B[32minfo ] Starting training dataset=5212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:37:10.381950Z [\u001B[32minfo ] Training complete train_loss=0.9655492305755615\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T07:37:10.387022Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T07:37:23.881587Z [\u001B[32minfo ] Evaluation complete test_loss=0.5173717141151428\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T07:37:23.888603Z [\u001B[32minfo ] Start Predict dataset=44788\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:52:26.318401Z [\u001B[32minfo ] Starting training dataset=5312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:53:50.090727Z [\u001B[32minfo ] Training complete train_loss=0.9988188147544861\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:53:50.166640Z [\u001B[32minfo ] Starting training dataset=5312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:55:13.680111Z [\u001B[32minfo ] Training complete train_loss=0.9845055341720581\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:55:13.687638Z [\u001B[32minfo ] Starting training dataset=5312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:56:38.190458Z [\u001B[32minfo ] Training complete train_loss=0.9391801357269287\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:56:38.268577Z [\u001B[32minfo ] Starting training dataset=5312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:58:02.480110Z [\u001B[32minfo ] Training complete train_loss=0.9259323477745056\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T07:58:02.486874Z [\u001B[32minfo ] Starting training dataset=5312 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T07:59:26.789141Z [\u001B[32minfo ] Training complete train_loss=0.9599472284317017\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T07:59:26.794346Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T07:59:40.283603Z [\u001B[32minfo ] Evaluation complete test_loss=0.5252811908721924\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T07:59:40.292579Z [\u001B[32minfo ] Start Predict dataset=44688\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:14:41.452168Z [\u001B[32minfo ] Starting training dataset=5412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:16:06.479870Z [\u001B[32minfo ] Training complete train_loss=0.9627469182014465\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:16:06.487092Z [\u001B[32minfo ] Starting training dataset=5412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:17:32.184454Z [\u001B[32minfo ] Training complete train_loss=0.9845385551452637\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:17:32.193680Z [\u001B[32minfo ] Starting training dataset=5412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:18:58.381604Z [\u001B[32minfo ] Training complete train_loss=0.9816064238548279\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:18:58.389140Z [\u001B[32minfo ] Starting training dataset=5412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:20:23.482114Z [\u001B[32minfo ] Training complete train_loss=0.9480637311935425\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:20:23.489218Z [\u001B[32minfo ] Starting training dataset=5412 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:21:48.982720Z [\u001B[32minfo ] Training complete train_loss=0.9441766142845154\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T08:21:48.988076Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T08:22:02.376927Z [\u001B[32minfo ] Evaluation complete test_loss=0.5122457146644592\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T08:22:02.384138Z [\u001B[32minfo ] Start Predict dataset=44588\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:36:59.422635Z [\u001B[32minfo ] Starting training dataset=5512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:38:25.077782Z [\u001B[32minfo ] Training complete train_loss=0.9657859206199646\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:38:25.084857Z [\u001B[32minfo ] Starting training dataset=5512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:39:51.477342Z [\u001B[32minfo ] Training complete train_loss=0.9360918402671814\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:39:51.483954Z [\u001B[32minfo ] Starting training dataset=5512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:41:17.575167Z [\u001B[32minfo ] Training complete train_loss=0.938201904296875\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:41:17.582039Z [\u001B[32minfo ] Starting training dataset=5512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:42:44.375687Z [\u001B[32minfo ] Training complete train_loss=0.9420015811920166\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:42:44.383021Z [\u001B[32minfo ] Starting training dataset=5512 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T08:44:09.980322Z [\u001B[32minfo ] Training complete train_loss=0.9979342222213745\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T08:44:09.985577Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T08:44:23.583289Z [\u001B[32minfo ] Evaluation complete test_loss=0.5096003413200378\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T08:44:23.592039Z [\u001B[32minfo ] Start Predict dataset=44488\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T08:59:20.005780Z [\u001B[32minfo ] Starting training dataset=5612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:00:47.983898Z [\u001B[32minfo ] Training complete train_loss=0.9347040057182312\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:00:47.992177Z [\u001B[32minfo ] Starting training dataset=5612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:02:14.675232Z [\u001B[32minfo ] Training complete train_loss=0.9421486258506775\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:02:14.682607Z [\u001B[32minfo ] Starting training dataset=5612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:03:42.471977Z [\u001B[32minfo ] Training complete train_loss=0.9339758157730103\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:03:42.478964Z [\u001B[32minfo ] Starting training dataset=5612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:05:09.588801Z [\u001B[32minfo ] Training complete train_loss=0.940785825252533\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:05:09.596157Z [\u001B[32minfo ] Starting training dataset=5612 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:06:36.388508Z [\u001B[32minfo ] Training complete train_loss=0.9172220230102539\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T09:06:36.394159Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T09:06:49.779906Z [\u001B[32minfo ] Evaluation complete test_loss=0.5034990906715393\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T09:06:49.787079Z [\u001B[32minfo ] Start Predict dataset=44388\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:21:42.877712Z [\u001B[32minfo ] Starting training dataset=5712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:23:11.072422Z [\u001B[32minfo ] Training complete train_loss=0.9773575663566589\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:23:11.080054Z [\u001B[32minfo ] Starting training dataset=5712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:24:39.979758Z [\u001B[32minfo ] Training complete train_loss=0.9121693968772888\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:24:39.986819Z [\u001B[32minfo ] Starting training dataset=5712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:26:08.273716Z [\u001B[32minfo ] Training complete train_loss=0.9608463644981384\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:26:08.282000Z [\u001B[32minfo ] Starting training dataset=5712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:27:36.386442Z [\u001B[32minfo ] Training complete train_loss=0.9310780167579651\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:27:36.393392Z [\u001B[32minfo ] Starting training dataset=5712 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:29:04.675088Z [\u001B[32minfo ] Training complete train_loss=0.9387174844741821\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T09:29:04.681026Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T09:29:18.473080Z [\u001B[32minfo ] Evaluation complete test_loss=0.502855658531189\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T09:29:18.480180Z [\u001B[32minfo ] Start Predict dataset=44288\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:44:11.984106Z [\u001B[32minfo ] Starting training dataset=5812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:45:41.684589Z [\u001B[32minfo ] Training complete train_loss=0.9386858940124512\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:45:41.691708Z [\u001B[32minfo ] Starting training dataset=5812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:47:11.889461Z [\u001B[32minfo ] Training complete train_loss=0.9746957421302795\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:47:11.896628Z [\u001B[32minfo ] Starting training dataset=5812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:48:41.878885Z [\u001B[32minfo ] Training complete train_loss=0.9614348411560059\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:48:41.885215Z [\u001B[32minfo ] Starting training dataset=5812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:50:10.674407Z [\u001B[32minfo ] Training complete train_loss=0.9561299681663513\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T09:50:10.680954Z [\u001B[32minfo ] Starting training dataset=5812 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T09:51:40.788730Z [\u001B[32minfo ] Training complete train_loss=0.9587898850440979\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T09:51:40.794217Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T09:51:54.684919Z [\u001B[32minfo ] Evaluation complete test_loss=0.506767749786377\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T09:51:54.692014Z [\u001B[32minfo ] Start Predict dataset=44188\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:06:42.728074Z [\u001B[32minfo ] Starting training dataset=5912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:08:13.492879Z [\u001B[32minfo ] Training complete train_loss=0.9357155561447144\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:08:13.571886Z [\u001B[32minfo ] Starting training dataset=5912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:09:43.775531Z [\u001B[32minfo ] Training complete train_loss=0.928965151309967\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:09:43.783025Z [\u001B[32minfo ] Starting training dataset=5912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:11:15.290185Z [\u001B[32minfo ] Training complete train_loss=0.9106123447418213\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:11:15.366495Z [\u001B[32minfo ] Starting training dataset=5912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:12:46.477856Z [\u001B[32minfo ] Training complete train_loss=0.9481375813484192\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:12:46.484984Z [\u001B[32minfo ] Starting training dataset=5912 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:14:20.480515Z [\u001B[32minfo ] Training complete train_loss=0.9085630774497986\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T10:14:20.485748Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T10:14:34.189367Z [\u001B[32minfo ] Evaluation complete test_loss=0.4975428879261017\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T10:14:34.198983Z [\u001B[32minfo ] Start Predict dataset=44088\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:29:23.082284Z [\u001B[32minfo ] Starting training dataset=6012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:30:54.879188Z [\u001B[32minfo ] Training complete train_loss=0.9179600477218628\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:30:54.885979Z [\u001B[32minfo ] Starting training dataset=6012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:32:27.188698Z [\u001B[32minfo ] Training complete train_loss=0.954762876033783\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:32:27.196154Z [\u001B[32minfo ] Starting training dataset=6012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:33:58.874241Z [\u001B[32minfo ] Training complete train_loss=0.9262583255767822\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:33:58.881235Z [\u001B[32minfo ] Starting training dataset=6012 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:35:30.088966Z [\u001B[32minfo ] Training complete train_loss=0.9067783951759338\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:35:30.096937Z [\u001B[32minfo ] Starting training dataset=6012 epoch=10\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:51:59.151384Z [\u001B[32minfo ] Starting training dataset=6112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:53:36.383207Z [\u001B[32minfo ] Training complete train_loss=0.9925647974014282\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:53:36.391280Z [\u001B[32minfo ] Starting training dataset=6112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:55:12.673108Z [\u001B[32minfo ] Training complete train_loss=0.9393655061721802\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:55:12.681112Z [\u001B[32minfo ] Starting training dataset=6112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:56:49.685065Z [\u001B[32minfo ] Training complete train_loss=0.9082939028739929\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:56:49.692182Z [\u001B[32minfo ] Starting training dataset=6112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T10:58:26.181962Z [\u001B[32minfo ] Training complete train_loss=0.9237620830535889\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T10:58:26.189559Z [\u001B[32minfo ] Starting training dataset=6112 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T11:00:00.983051Z [\u001B[32minfo ] Training complete train_loss=0.9448640942573547\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T11:00:00.988386Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T11:00:14.479037Z [\u001B[32minfo ] Evaluation complete test_loss=0.5105274319648743\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T11:00:14.485896Z [\u001B[32minfo ] Start Predict dataset=43888\n",
- "Training model 0\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T11:14:56.388463Z [\u001B[32minfo ] Starting training dataset=6212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T11:16:28.581044Z [\u001B[32minfo ] Training complete train_loss=0.9154177308082581\n",
- "Training model 1\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T11:16:28.587949Z [\u001B[32minfo ] Starting training dataset=6212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T11:18:00.773198Z [\u001B[32minfo ] Training complete train_loss=0.9639526605606079\n",
- "Training model 2\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T11:18:00.780174Z [\u001B[32minfo ] Starting training dataset=6212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T11:19:33.565889Z [\u001B[32minfo ] Training complete train_loss=1.0114784240722656\n",
- "Training model 3\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T11:19:33.573952Z [\u001B[32minfo ] Starting training dataset=6212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T11:21:08.172122Z [\u001B[32minfo ] Training complete train_loss=0.932121217250824\n",
- "Training model 4\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:109] 2021-07-29T11:21:08.179191Z [\u001B[32minfo ] Starting training dataset=6212 epoch=10\n",
- "[14095-MainThread] [baal.modelwrapper:train_on_dataset:119] 2021-07-29T11:22:41.977505Z [\u001B[32minfo ] Training complete train_loss=0.9630128741264343\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:147] 2021-07-29T11:22:41.984938Z [\u001B[32minfo ] Starting evaluating dataset=10000\n",
- "[14095-MainThread] [baal.modelwrapper:test_on_dataset:156] 2021-07-29T11:22:55.375432Z [\u001B[32minfo ] Evaluation complete test_loss=0.4921897053718567\n",
- "[14095-MainThread] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-29T11:22:55.382629Z [\u001B[32minfo ] Start Predict dataset=43788\n"
- ]
+ "execution_count": null,
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n",
+ "is_executing": true
}
- ],
+ },
+ "outputs": [],
"source": [
"report = []\n",
"for epoch in tqdm(range(hyperparams.epoch)):\n",
@@ -1297,7 +252,11 @@
{
"cell_type": "code",
"execution_count": 5,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
diff --git a/notebooks/fairness/ActiveFairness.ipynb b/notebooks/fairness/ActiveFairness.ipynb
index 08e909b1..168ed79d 100644
--- a/notebooks/fairness/ActiveFairness.ipynb
+++ b/notebooks/fairness/ActiveFairness.ipynb
@@ -2,7 +2,11 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Can active learning preemptively mitigate fairness issues?\n",
"*By Parmida Atighehchian*\n",
@@ -27,7 +31,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### Introducing bias in dataset\n",
"\n",
@@ -45,7 +53,11 @@
{
"cell_type": "code",
"execution_count": 1,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stderr",
@@ -109,7 +121,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### Prepare model and dataset to be used in BaaL setup\n",
"As usual we wrap the train_set in `ActiveLearningDataset` and using vgg16 as default model, we use the BaaL's `patch_module` to create a dropout layer which performs in inference time."
@@ -118,7 +134,11 @@
{
"cell_type": "code",
"execution_count": 3,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from torchvision.transforms import transforms\n",
@@ -168,7 +188,11 @@
{
"cell_type": "code",
"execution_count": 4,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from torchvision import models\n",
@@ -192,7 +216,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"We wrap the pytorch criterion to accomodate target being a dictionary."
]
@@ -200,7 +228,11 @@
{
"cell_type": "code",
"execution_count": 5,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from torch import nn\n",
@@ -216,7 +248,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"#### Training\n",
"\n",
@@ -229,7 +265,10 @@
"metadata": {
"tags": [
"no_output"
- ]
+ ],
+ "pycharm": {
+ "name": "#%%\n"
+ }
},
"outputs": [],
"source": [
@@ -317,7 +356,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### Results and Discussion\n",
"\n",
@@ -328,7 +371,10 @@
"cell_type": "code",
"execution_count": 17,
"metadata": {
- "scrolled": false
+ "scrolled": false,
+ "pycharm": {
+ "name": "#%%\n"
+ }
},
"outputs": [
{
@@ -370,7 +416,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"We demonstrate the `test_loss` and `training_size` using `bald` vs `random` as heuristics. As it is shown, the trainig size increases with the same pace but the above graphs shows the underlying difference in the existing samples for each class which then results in also a better loss decrease using `bald`."
]
@@ -378,7 +428,11 @@
{
"cell_type": "code",
"execution_count": 16,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -431,4 +485,4 @@
},
"nbformat": 4,
"nbformat_minor": 4
-}
+}
\ No newline at end of file
diff --git a/notebooks/fundamentals/active-learning.ipynb b/notebooks/fundamentals/active-learning.ipynb
index 4072836d..068e39fb 100644
--- a/notebooks/fundamentals/active-learning.ipynb
+++ b/notebooks/fundamentals/active-learning.ipynb
@@ -2,7 +2,11 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"# Active learning infrastructure objects\n",
"\n",
@@ -27,7 +31,11 @@
{
"cell_type": "code",
"execution_count": 2,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"path = \"/Users/jan/datasets/mnist/\""
@@ -36,7 +44,11 @@
{
"cell_type": "code",
"execution_count": 4,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from torchvision import transforms, datasets\n",
@@ -53,7 +65,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"As you can see, this is a fairly thin wrapper around MNIST. But, we can now\n",
"check several new properties of this dataset:\n"
@@ -62,7 +78,11 @@
{
"cell_type": "code",
"execution_count": 5,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -82,7 +102,11 @@
{
"cell_type": "code",
"execution_count": 6,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -101,7 +125,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"We can also start labelling data. Either randomly, or based on specific indices:"
]
@@ -109,7 +137,11 @@
{
"cell_type": "code",
"execution_count": 7,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"active_mnist.label_randomly(10)\n",
@@ -118,7 +150,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"We've just labelled 10 points randomly, and 3 points based on specific indices.\n",
"Now, if we check how many have been labelled, we see that 13 have been labelled:\n"
@@ -127,7 +163,11 @@
{
"cell_type": "code",
"execution_count": 8,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -147,7 +187,11 @@
{
"cell_type": "code",
"execution_count": 9,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -166,7 +210,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"We will also see that when we check the length of this dataset - something that\n",
"is done by e.g. pytorch `DataLoader` classes - it only gives the length of the\n",
@@ -176,7 +224,11 @@
{
"cell_type": "code",
"execution_count": 12,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -195,7 +247,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"And, if we try to access an item, it will only allow us to index the _labelled_\n",
"datapoints:\n"
@@ -204,7 +260,11 @@
{
"cell_type": "code",
"execution_count": 13,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -246,7 +306,11 @@
{
"cell_type": "code",
"execution_count": 14,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"ename": "IndexError",
@@ -268,7 +332,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"Instead, if we want to actually use the _unlabelled_ data, we need to use the\n",
"`pool` attribute of the active learning dataset, which is itself a dataset:\n"
@@ -277,7 +345,11 @@
{
"cell_type": "code",
"execution_count": 15,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
diff --git a/notebooks/fundamentals/posteriors.ipynb b/notebooks/fundamentals/posteriors.ipynb
index 68467cff..0acb9254 100644
--- a/notebooks/fundamentals/posteriors.ipynb
+++ b/notebooks/fundamentals/posteriors.ipynb
@@ -2,7 +2,11 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"# Methods for approximating bayesian posteriors \n",
"\n",
@@ -20,7 +24,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Monte-Carlo Dropout\n",
"\n",
@@ -38,7 +46,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### Usage\n",
"\n",
@@ -48,10 +60,15 @@
{
"cell_type": "code",
"execution_count": 1,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"import torch\n",
+ "\n",
"import baal.bayesian.dropout\n",
"\n",
"standard_model = torch.nn.Sequential(\n",
@@ -77,7 +94,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"The main difference between these is that the standard model will set the dropout probability to zero during eval, while the MC dropout model will not:"
]
@@ -85,7 +106,11 @@
{
"cell_type": "code",
"execution_count": 2,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stdout",
@@ -108,7 +133,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"In order to get a distribution of model outputs, you simply need to repeatedly run the same data through the MC Dropout model. `baal` makes this easier for you by providing a class called `ModelWrapper`. This class accepts your model and a criterion (loss) function, and provides several utility functions, such as running training steps and more. The one that is important for obtaining a posterior distribution is `Modelwrapper.predict_on_batch`.\n",
"\n",
@@ -118,7 +147,11 @@
{
"cell_type": "code",
"execution_count": 3,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from baal.modelwrapper import ModelWrapper\n",
@@ -134,7 +167,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"The tensor \"prediction_distribution\" has the shape (batch size) x (output size) x iterations:"
]
@@ -142,7 +179,11 @@
{
"cell_type": "code",
"execution_count": 4,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -161,7 +202,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"We can visualise this posterior distribution, for example for the first data point in our\n",
"minibatch (although note that because this model is overly simplistic, this is not very\n",
@@ -171,7 +216,11 @@
{
"cell_type": "code",
"execution_count": 5,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -188,7 +237,7 @@
],
"source": [
"import matplotlib.pyplot as plt\n",
- "%matplotlib inline\n",
+ "% matplotlib inline\n",
"\n",
"fig, ax = plt.subplots()\n",
"ax.hist(predictions[0, 0, :].numpy(), bins=50);\n",
@@ -197,7 +246,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Drop Connect\n",
"\n",
@@ -209,7 +262,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"### Usage\n",
"As usual we have pre-implemented wrappers to ease your job for this. Example below shows how to use this module:"
@@ -218,10 +275,16 @@
{
"cell_type": "code",
"execution_count": 6,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"import torch\n",
+ "\n",
+ "\n",
"class DummyModel(torch.nn.Module):\n",
" def __init__(self):\n",
" super(DummyModel, self).__init__()\n",
@@ -242,7 +305,11 @@
{
"cell_type": "code",
"execution_count": 7,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"import numpy as np\n",
@@ -260,7 +327,11 @@
{
"cell_type": "code",
"execution_count": 8,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -279,7 +350,11 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"let's visualize the posterior:"
]
@@ -287,7 +362,11 @@
{
"cell_type": "code",
"execution_count": 9,
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -304,7 +383,7 @@
],
"source": [
"import matplotlib.pyplot as plt\n",
- "%matplotlib inline\n",
+ "% matplotlib inline\n",
"\n",
"fig, ax = plt.subplots()\n",
"ax.hist(predictions[0, 0, :].numpy(), bins=50);\n",
@@ -313,11 +392,23 @@
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
- "As part of our experiments, we run MCDropout(MCD) and DropConnect(MCDC) against eachother. We let the experiments run for 2000 epochs on `vgg16` using `CIFAR10` and tried different number of iterations and weight drop rate for Dropconnect. As the experiments show, `DropConnect` could give a better result if it is used with number of iterations more than `80` and drop weight rate of around `50%`. The reference [paper](https://arxiv.org/pdf/1906.04569.pdf) indicates having a drop rate of `94%` should give the best result but our experiments show otherwise. The main factor of change for DropConnect is the number of `iterations` used to estimate the posterior. However, as we can see for MCDropout, number of `iterations` 40 and 80 would give almost same results which would overfit by time. In order to prevent overfitting, we could change `learning rate` and use other techniques and get a lift on the performance, however as could be seen for higher `iterations`, DropConnect could easily outperform MCDropout at 10K training set size. \n",
- "Finally, the choice of method and training process is always for the user and depending on the problem in hand. Parameters like how low the validation error should be and if the training is allowed to be run for few days or there is a time limit could all effect in which strategy is best and which hyperparameters to choose.\n",
- "![MCD VS MCDC](https://github.com/ElementAI/baal/blob/master/docs/literature/images/experiment_results/iterations_mcdc.png?raw=true)"
+ "As part of our experiments, we compare MCDropout(MCD) and DropConnect(MCDC). We let the experiments run for 2000 epochs on `vgg16` using `CIFAR10` and tried different number of iterations and weight drop rate for Dropconnect.\n",
+ "Our experiments indicate that `DropConnect` could give a better result if it is used with number of iterations more than `80` and drop weight rate of around `50%`.\n",
+ "\n",
+ "The reference [paper](https://arxiv.org/pdf/1906.04569.pdf) indicates using a drop rate of `94%` give the best result but our experiments show otherwise.\n",
+ "The main factor of change for DropConnect is the number of `iterations` used to estimate the posterior. However, as we can see for MCDropout, number of `iterations` 40 and 80 would give almost the same results.\n",
+ " In order to prevent overfitting, we could change `learning rate` and use other techniques and get a lift on the performance, however as could be seen for higher `iterations`, DropConnect could easily outperform MCDropout at 10K training set size.\n",
+ "\n",
+ "Finally, the choice of method and training process is always up to the user and their current dataset.\n",
+ "Parameters like how low the validation error should be and if the training is allowed to be run for few days or there is a time limit could all effect in which strategy is best and which hyperparameters to choose.\n",
+ "\n",
+ " "
]
}
],
diff --git a/poetry.lock b/poetry.lock
index 970d6b6d..e41cbd4e 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -42,6 +42,17 @@ python-versions = ">=3.6"
[package.dependencies]
frozenlist = ">=1.1.0"
+[[package]]
+name = "astunparse"
+version = "1.6.3"
+description = "An AST unparser for Python"
+category = "dev"
+optional = false
+python-versions = "*"
+
+[package.dependencies]
+six = ">=1.6.1,<2.0"
+
[[package]]
name = "async-timeout"
version = "4.0.2"
@@ -485,6 +496,20 @@ requests-oauthlib = ">=0.7.0"
[package.extras]
tool = ["click (>=6.0.0)"]
+[[package]]
+name = "griffe"
+version = "0.21.0"
+description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+cached-property = {version = "*", markers = "python_version < \"3.8\""}
+
+[package.extras]
+async = ["aiofiles (>=0.7,<1.0)"]
+
[[package]]
name = "grpcio"
version = "1.43.0"
@@ -638,7 +663,7 @@ python-versions = ">=3.6"
[[package]]
name = "jsonargparse"
-version = "4.9.0"
+version = "4.10.0"
description = "Parsing of command line options, yaml/jsonnet config files and/or environment variables based on argparse."
category = "dev"
optional = false
@@ -649,9 +674,8 @@ docstring-parser = {version = ">=0.7.3", optional = true, markers = "extra == \"
PyYAML = ">=3.13"
[package.extras]
-all = ["docstring-parser (>=0.7.3)", "jsonschema (>=3.2.0)", "jsonnet (>=0.13.0)", "validators (>=0.14.2)", "requests (>=2.18.4)", "fsspec (>=0.8.4)", "argcomplete (>=2.0.0)", "ruyaml (>=0.20.0)", "omegaconf (>=2.1.1)", "reconplogger (>=4.4.0)", "typing-extensions (>=3.10.0.0)", "dataclasses (>=0.8)"]
+all = ["docstring-parser (>=0.7.3)", "jsonschema (>=3.2.0)", "jsonnet (>=0.13.0)", "validators (>=0.14.2)", "requests (>=2.18.4)", "fsspec (>=0.8.4)", "argcomplete (>=2.0.0)", "ruyaml (>=0.20.0)", "omegaconf (>=2.1.1)", "reconplogger (>=4.4.0)", "typing-extensions (>=3.10.0.0)"]
argcomplete = ["argcomplete (>=2.0.0)"]
-dataclasses = ["dataclasses (>=0.8)"]
dev = ["coverage (>=4.5.1)", "responses (>=0.12.0)", "Sphinx (>=1.7.9)", "sphinx-rtd-theme (>=0.4.3)", "autodocsumm (>=0.1.10)", "sphinx-autodoc-typehints (>=1.11.1)", "pre-commit (>=2.19.0)", "pylint (>=1.8.3)", "pycodestyle (>=2.5.0)", "mypy (>=0.701)", "tox (>=3.25.0)"]
doc = ["Sphinx (>=1.7.9)", "sphinx-rtd-theme (>=0.4.3)", "autodocsumm (>=0.1.10)", "sphinx-autodoc-typehints (>=1.11.1)"]
fsspec = ["fsspec (>=0.8.4)"]
@@ -978,6 +1002,18 @@ watchdog = ">=2.0"
[package.extras]
i18n = ["babel (>=2.9.0)"]
+[[package]]
+name = "mkdocs-autorefs"
+version = "0.4.1"
+description = "Automatically link across pages in MkDocs."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+Markdown = ">=3.3"
+mkdocs = ">=1.1"
+
[[package]]
name = "mkdocs-jupyter"
version = "0.21.0"
@@ -1017,6 +1053,53 @@ category = "dev"
optional = false
python-versions = ">=3.6"
+[[package]]
+name = "mkdocstrings"
+version = "0.18.1"
+description = "Automatic documentation from sources, for MkDocs."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+Jinja2 = ">=2.11.1"
+Markdown = ">=3.3"
+MarkupSafe = ">=1.1"
+mkdocs = ">=1.2"
+mkdocs-autorefs = ">=0.3.1"
+mkdocstrings-python = {version = ">=0.5.2", optional = true, markers = "extra == \"python\""}
+mkdocstrings-python-legacy = ">=0.2"
+pymdown-extensions = ">=6.3"
+
+[package.extras]
+crystal = ["mkdocstrings-crystal (>=0.3.4)"]
+python = ["mkdocstrings-python (>=0.5.2)"]
+python-legacy = ["mkdocstrings-python-legacy (>=0.2.1)"]
+
+[[package]]
+name = "mkdocstrings-python"
+version = "0.6.6"
+description = "A Python handler for mkdocstrings."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+griffe = ">=0.11.1"
+mkdocstrings = ">=0.18"
+
+[[package]]
+name = "mkdocstrings-python-legacy"
+version = "0.2.2"
+description = "A legacy Python handler for mkdocstrings."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+mkdocstrings = ">=0.18"
+pytkdocs = ">=0.14"
+
[[package]]
name = "multidict"
version = "5.2.0"
@@ -1466,6 +1549,22 @@ python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
[package.dependencies]
six = ">=1.5"
+[[package]]
+name = "pytkdocs"
+version = "0.16.1"
+description = "Load Python objects documentation."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+astunparse = {version = ">=1.6", markers = "python_version < \"3.9\""}
+cached-property = {version = ">=1.5", markers = "python_version < \"3.8\""}
+typing-extensions = {version = ">=3.7", markers = "python_version < \"3.8\""}
+
+[package.extras]
+numpy-style = ["docstring_parser (>=0.7)"]
+
[[package]]
name = "pytorch-lightning"
version = "1.5.8"
@@ -2067,7 +2166,7 @@ vision = ["torchvision"]
[metadata]
lock-version = "1.1"
python-versions = ">=3.7.1,<4"
-content-hash = "3cbd5faf3ba5cc490829f3baf57a1758df1562e395f7800edb6878397bf46471"
+content-hash = "8b209ee7cb8ac2fea451b849155766636f927f01e1a33399cedf3634bed5edfd"
[metadata.files]
absl-py = [
@@ -2152,6 +2251,10 @@ aiosignal = [
{file = "aiosignal-1.2.0-py3-none-any.whl", hash = "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a"},
{file = "aiosignal-1.2.0.tar.gz", hash = "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"},
]
+astunparse = [
+ {file = "astunparse-1.6.3-py2.py3-none-any.whl", hash = "sha256:c2652417f2c8b5bb325c885ae329bdf3f86424075c4fd1a128674bc6fba4b8e8"},
+ {file = "astunparse-1.6.3.tar.gz", hash = "sha256:5ad93a8456f0d084c3456d059fd9a92cce667963232cbf763eac3bc5b7940872"},
+]
async-timeout = [
{file = "async-timeout-4.0.2.tar.gz", hash = "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15"},
{file = "async_timeout-4.0.2-py3-none-any.whl", hash = "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"},
@@ -2448,6 +2551,10 @@ google-auth-oauthlib = [
{file = "google-auth-oauthlib-0.4.6.tar.gz", hash = "sha256:a90a072f6993f2c327067bf65270046384cda5a8ecb20b94ea9a687f1f233a7a"},
{file = "google_auth_oauthlib-0.4.6-py2.py3-none-any.whl", hash = "sha256:3f2a6e802eebbb6fb736a370fbf3b055edcb6b52878bf2f26330b5e041316c73"},
]
+griffe = [
+ {file = "griffe-0.21.0-py3-none-any.whl", hash = "sha256:e9fb5eeb7c721e1d84804452bdc742bd57b120b13aba663157668ae2d217088a"},
+ {file = "griffe-0.21.0.tar.gz", hash = "sha256:61ab3bc02b09afeb489f1aef44c646a09f1837d9cdf15943ac6021903a4d3984"},
+]
grpcio = [
{file = "grpcio-1.43.0-cp310-cp310-linux_armv7l.whl", hash = "sha256:a4e786a8ee8b30b25d70ee52cda6d1dbba2a8ca2f1208d8e20ed8280774f15c8"},
{file = "grpcio-1.43.0-cp310-cp310-macosx_10_10_universal2.whl", hash = "sha256:af9c3742f6c13575c0d4147a8454da0ff5308c4d9469462ff18402c6416942fe"},
@@ -2550,8 +2657,8 @@ joblib = [
{file = "joblib-1.1.0.tar.gz", hash = "sha256:4158fcecd13733f8be669be0683b96ebdbbd38d23559f54dca7205aea1bf1e35"},
]
jsonargparse = [
- {file = "jsonargparse-4.9.0-py3-none-any.whl", hash = "sha256:aecd494346c251dd34372239b9bafe46fc7d760f07dc548d6aac58176cf3fce2"},
- {file = "jsonargparse-4.9.0.tar.gz", hash = "sha256:4a2f4194796eb5d1a36179efdfc7e6bc383d9757b977192b4b2a6ea39d04b69d"},
+ {file = "jsonargparse-4.10.0-py3-none-any.whl", hash = "sha256:8042b5caa09b742fd963870f99050d43adee6668cd164ab437c419396564e45d"},
+ {file = "jsonargparse-4.10.0.tar.gz", hash = "sha256:db8d7adb3402c2269fa8b59e6f6a85d071bed0d0b6edea8cba23cf0ac19073f3"},
]
jsonschema = [
{file = "jsonschema-4.3.3-py3-none-any.whl", hash = "sha256:eb7a69801beb7325653aa8fd373abbf9ff8f85b536ab2812e5e8287b522fb6a2"},
@@ -2753,6 +2860,10 @@ mkdocs = [
{file = "mkdocs-1.3.0-py3-none-any.whl", hash = "sha256:26bd2b03d739ac57a3e6eed0b7bcc86168703b719c27b99ad6ca91dc439aacde"},
{file = "mkdocs-1.3.0.tar.gz", hash = "sha256:b504405b04da38795fec9b2e5e28f6aa3a73bb0960cb6d5d27ead28952bd35ea"},
]
+mkdocs-autorefs = [
+ {file = "mkdocs-autorefs-0.4.1.tar.gz", hash = "sha256:70748a7bd025f9ecd6d6feeba8ba63f8e891a1af55f48e366d6d6e78493aba84"},
+ {file = "mkdocs_autorefs-0.4.1-py3-none-any.whl", hash = "sha256:a2248a9501b29dc0cc8ba4c09f4f47ff121945f6ce33d760f145d6f89d313f5b"},
+]
mkdocs-jupyter = [
{file = "mkdocs-jupyter-0.21.0.tar.gz", hash = "sha256:c8c00ce44456e3cf50c5dc3fe0cb18fab6467fb5bafc2c0bfe1efff3e0a52470"},
]
@@ -2764,6 +2875,18 @@ mkdocs-material-extensions = [
{file = "mkdocs-material-extensions-1.0.3.tar.gz", hash = "sha256:bfd24dfdef7b41c312ede42648f9eb83476ea168ec163b613f9abd12bbfddba2"},
{file = "mkdocs_material_extensions-1.0.3-py3-none-any.whl", hash = "sha256:a82b70e533ce060b2a5d9eb2bc2e1be201cf61f901f93704b4acf6e3d5983a44"},
]
+mkdocstrings = [
+ {file = "mkdocstrings-0.18.1-py3-none-any.whl", hash = "sha256:4053929356df8cd69ed32eef71d8f676a472ef72980c9ffd4f933ead1debcdad"},
+ {file = "mkdocstrings-0.18.1.tar.gz", hash = "sha256:fb7c91ce7e3ab70488d3fa6c073a4f827cdc319042f682ef8ea95459790d64fc"},
+]
+mkdocstrings-python = [
+ {file = "mkdocstrings-python-0.6.6.tar.gz", hash = "sha256:37281696b9f199624ae420e0625b6659b7fdfbea736618bce7fd978682dea3b1"},
+ {file = "mkdocstrings_python-0.6.6-py3-none-any.whl", hash = "sha256:c118438d3cb4b14c492a51d109f4e5b27ab06ba19b099d624430dfd904926152"},
+]
+mkdocstrings-python-legacy = [
+ {file = "mkdocstrings-python-legacy-0.2.2.tar.gz", hash = "sha256:f0e7ec6a19750581b752acb38f6b32fcd1efe006f14f6703125d2c2c9a5c6f02"},
+ {file = "mkdocstrings_python_legacy-0.2.2-py3-none-any.whl", hash = "sha256:379107a3a5b8db9b462efc4493c122efe21e825e3702425dbd404621302a563a"},
+]
multidict = [
{file = "multidict-5.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3822c5894c72e3b35aae9909bef66ec83e44522faf767c0ad39e0e2de11d3b55"},
{file = "multidict-5.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:28e6d883acd8674887d7edc896b91751dc2d8e87fbdca8359591a13872799e4e"},
@@ -3194,6 +3317,10 @@ python-dateutil = [
{file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
{file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
]
+pytkdocs = [
+ {file = "pytkdocs-0.16.1-py3-none-any.whl", hash = "sha256:a8c3f46ecef0b92864cc598e9101e9c4cf832ebbf228f50c84aa5dd850aac379"},
+ {file = "pytkdocs-0.16.1.tar.gz", hash = "sha256:e2ccf6dfe9dbbceb09818673f040f1a7c32ed0bffb2d709b06be6453c4026045"},
+]
pytorch-lightning = [
{file = "pytorch-lightning-1.5.8.tar.gz", hash = "sha256:57e7c9ea3663e5d8416be6a340d3a8d9271eb105fba82b7f7497cf0299880939"},
{file = "pytorch_lightning-1.5.8-py3-none-any.whl", hash = "sha256:75783eb6c4d043d95691a341a81e7df2bcb02254779992f86c4276cf6ff3739b"},
diff --git a/pyproject.toml b/pyproject.toml
index 73a776a9..f270cfbe 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -49,8 +49,12 @@ docutils = "0.16"
# Lightning
lightning-flash = {git = "https://github.com/PyTorchLightning/lightning-flash.git", extras = ["image"]}
+
+# Documentation
mkdocs-jupyter = "^0.21.0"
mkdocs-material = "^8.3.6"
+Pygments = "^2.12.0"
+mkdocstrings = {extras = ["python"], version = "^0.18.1"}
[tool.poetry.extras]
vision = ["torchvision"]
From 5fbdcccfe5e4a978b224cc51072270b9025c084e Mon Sep 17 00:00:00 2001
From: Dref360
Date: Sat, 9 Jul 2022 16:46:06 -0400
Subject: [PATCH 03/10] Changes according to review
---
docs/index.md | 13 +++++++
docs/notebooks | 1 +
docs/research/index.md | 12 ++++++-
docs/research/literature/core-papers.md | 39 ---------------------
docs/research/literature/index.md | 45 ++++++++++++++++++++-----
docs/research/literature/more_papers.md | 12 -------
docs/tutorials/index.md | 8 ++---
docs/user_guide/heuristics.md | 34 +++++++++++++++++++
mkdocs.yml | 13 +++----
9 files changed, 107 insertions(+), 70 deletions(-)
create mode 120000 docs/notebooks
delete mode 100644 docs/research/literature/core-papers.md
delete mode 100644 docs/research/literature/more_papers.md
create mode 100644 docs/user_guide/heuristics.md
diff --git a/docs/index.md b/docs/index.md
index 5bcdac01..4fd4ca7a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -9,6 +9,19 @@ To know more on what is Bayesian active learning, see our [User guide](user_guid
We are a member of Pytorch's ecosystem, and we welcome contributions from the community.
If you have any question, we are reachable on [Slack](https://join.slack.com/t/baal-world/shared_invite/zt-z0izhn4y-Jt6Zu5dZaV2rsAS9sdISfg).
+## Installation
+
+Baal is available as a package on PyPI:
+
+`pip install baal`
+
+??? "Additional dependencies for vision and NLP"
+
+ `baal[nlp]` installs needed dependencies for HuggingFace support.
+
+ `baal[vision]` installs dependencies for our Lightning-Flash integration.
+
+
## Support
For support, we have several ways to help you:
diff --git a/docs/notebooks b/docs/notebooks
new file mode 120000
index 00000000..8f9a5b2e
--- /dev/null
+++ b/docs/notebooks
@@ -0,0 +1 @@
+../notebooks
\ No newline at end of file
diff --git a/docs/research/index.md b/docs/research/index.md
index 64473875..86598ed8 100644
--- a/docs/research/index.md
+++ b/docs/research/index.md
@@ -1,2 +1,12 @@
-# Bayesian active learning research
+# Bayesian deep active learning research
+Research in this field is very active, with multiple labs around the world working on the problem.
+
+In a nutshell, we want to:
+
+> Optimize labelling by maximizing the information obtained after each label.
+
+Another critical goal of our research is to better understand the sampling bias active learning creates.
+Recent research has shown that active learning creates more balanced, fairer datasets.
+
+We strongly suggest going through our [literature review](./literature/index.md).
diff --git a/docs/research/literature/core-papers.md b/docs/research/literature/core-papers.md
deleted file mode 100644
index 0606e5fd..00000000
--- a/docs/research/literature/core-papers.md
+++ /dev/null
@@ -1,39 +0,0 @@
-# The theory behind Bayesian active learning
-
-In this document, we keep a list of the papers to get you started in Bayesian deep learning and Bayesian active learning.
-
-We hope to include a summary for each of then in the future, but for now we have this list with some notes.
-
-
-### How to estimate uncertainty in Deep Learning networks
-
-* [Excellent tutorial from AGW on Bayesian Deep Learning](https://icml.cc/virtual/2020/tutorial/5750)
- * This is inspired by his publication [Bayesian Deep Learning and a Probabilistic Perspective of Generalization](https://arxiv.org/abs/2002.08791)
-* [Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning](https://arxiv.org/pdf/1506.02142.pdf) (Gal and Ghahramani, 2016)
- * This describes Monte-Carlo Dropout, a way to estimate uncertainty through stochastic dropout at test time
-* [Bayesian Uncertainty Estimation for Batch Normalized Deep Networks](https://arxiv.org/abs/1802.06455) (Teye et al. 2018)
- * This describes Monte-Carlo BatchNorm, a way to estimate uncertainty through random batch norm parameters at test time
-* [Bayesian Deep Learning and a Probabilistic Perspective of Generalization](https://arxiv.org/abs/2002.08791) (Gordon Wilson and Izmailov, 2020)
- * Presentation of multi-SWAG a mix between VI and Ensembles.
-* [Advances in Variational inference](https://arxiv.org/pdf/1711.05597.pdf) (Zhang et al, 2018)
- * Gives a quick introduction to VI and the most recent advances.
-* [A Simple Baseline for Bayesian Uncertainty in Deep Learning](https://arxiv.org/abs/1902.02476) (Maddox et al. 2019)
- * Presents SWAG, an easy way to create ensembles.
-
-
-
-
-### Bayesian active learning
-* [Deep Bayesian Active Learning with Image Data](https://arxiv.org/pdf/1703.02910.pdf) (Gal and Islam and Ghahramani, 2017)
- * Fundamental paper on how to do Bayesian active learning. A must read.
-* [Sampling bias in active learning](http://cseweb.ucsd.edu/~dasgupta/papers/twoface.pdf) (Dasgupta 2009)
- * Presents sampling bias and how to solve it by combining heuristics and random selection.
-
-* [Bayesian Active Learning for Classification and Preference Learning](https://arxiv.org/pdf/1112.5745.pdf) (Houlsby et al. 2011)
- * Fundamental paper on one of the main heuristic BALD.
-
-
-### Bayesian active learning on NLP
-
-* [Deep Bayesian Active Learning for Natural Language Processing: Results of a Large-Scale Empirical Study](https://arxiv.org/abs/1808.05697) (Siddhant and Lipton, 2018)
- * Experimental paper on how to use Bayesian active learning on NLP tasks.
diff --git a/docs/research/literature/index.md b/docs/research/literature/index.md
index e68741fc..02f7dc58 100644
--- a/docs/research/literature/index.md
+++ b/docs/research/literature/index.md
@@ -7,13 +7,42 @@ If you've read a paper recently, write a little summary in markdown, put it in
the folder `docs/research/literature` and make a pull request. You can even do all of
that right in the Github web UI!
-```eval_rst
-.. toctree::
- :caption: Literature review
- :maxdepth: 1
- :glob:
+## The theory behind Bayesian active learning
- *
-```
+In this document, we keep a list of the papers to get you started in Bayesian deep learning and Bayesian active learning.
----
\ No newline at end of file
+We hope to include a summary for each of them in the future, but for now we have this list with some notes.
+
+
+### How to estimate uncertainty in Deep Learning networks
+
+* [Excellent tutorial from AGW on Bayesian Deep Learning](https://icml.cc/virtual/2020/tutorial/5750)
+ * This is inspired by his publication [Bayesian Deep Learning and a Probabilistic Perspective of Generalization](https://arxiv.org/abs/2002.08791)
+* [Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning](https://arxiv.org/pdf/1506.02142.pdf) (Gal and Ghahramani, 2016)
+ * This describes Monte-Carlo Dropout, a way to estimate uncertainty through stochastic dropout at test time
+* [Bayesian Uncertainty Estimation for Batch Normalized Deep Networks](https://arxiv.org/abs/1802.06455) (Teye et al. 2018)
+ * This describes Monte-Carlo BatchNorm, a way to estimate uncertainty through random batch norm parameters at test time
+* [Bayesian Deep Learning and a Probabilistic Perspective of Generalization](https://arxiv.org/abs/2002.08791) (Gordon Wilson and Izmailov, 2020)
+ * Presentation of multi-SWAG a mix between VI and Ensembles.
+* [Advances in Variational inference](https://arxiv.org/pdf/1711.05597.pdf) (Zhang et al, 2018)
+ * Gives a quick introduction to VI and the most recent advances.
+* [A Simple Baseline for Bayesian Uncertainty in Deep Learning](https://arxiv.org/abs/1902.02476) (Maddox et al. 2019)
+ * Presents SWAG, an easy way to create ensembles.
+
+
+
+
+### Bayesian active learning
+* [Deep Bayesian Active Learning with Image Data](https://arxiv.org/pdf/1703.02910.pdf) (Gal and Islam and Ghahramani, 2017)
+ * Fundamental paper on how to do Bayesian active learning. A must read.
+* [Sampling bias in active learning](http://cseweb.ucsd.edu/~dasgupta/papers/twoface.pdf) (Dasgupta 2009)
+ * Presents sampling bias and how to solve it by combining heuristics and random selection.
+
+* [Bayesian Active Learning for Classification and Preference Learning](https://arxiv.org/pdf/1112.5745.pdf) (Houlsby et al. 2011)
+ * Fundamental paper on BALD, one of the main heuristics.
+
+
+### Bayesian active learning on NLP
+
+* [Deep Bayesian Active Learning for Natural Language Processing: Results of a Large-Scale Empirical Study](https://arxiv.org/abs/1808.05697) (Siddhant and Lipton, 2018)
+ * Experimental paper on how to use Bayesian active learning on NLP tasks.
diff --git a/docs/research/literature/more_papers.md b/docs/research/literature/more_papers.md
deleted file mode 100644
index 0326ab1a..00000000
--- a/docs/research/literature/more_papers.md
+++ /dev/null
@@ -1,12 +0,0 @@
-## Additional papers that are interesting
-
-In this section, we put additional papers that can be interesting.
-
-```eval_rst
-.. toctree::
- :maxdepth: 1
- :caption: Additional papers
- :glob:
-
- Additional papers/*
-```
\ No newline at end of file
diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md
index 82e70532..a8711f11 100644
--- a/docs/tutorials/index.md
+++ b/docs/tutorials/index.md
@@ -1,13 +1,13 @@
# Tutorials
Tutorials are split in two sections, "How-to" and "Compatibility". The first one focuses on Baal's capabilities and the
-latter on how we integrate with other common frameworks such as [Label Studio], [HuggingFace] or [Lightning Flash].
+second on how we integrate with other common frameworks such as Label Studio, HuggingFace or Lightning Flash.
## :material-file-tree: How to
-* Run an active learning experiments
-* Active learning in production
-* Deep Ensembles
+* [Run an active learning experiment](notebooks/active_learning_process.ipynb)
+* [Active learning in production](notebooks/baal_prod_cls.ipynb)
+* [Deep Ensembles](../notebooks/deep_ensemble.ipynb)
## :material-file-tree: Compatibility
diff --git a/docs/user_guide/heuristics.md b/docs/user_guide/heuristics.md
new file mode 100644
index 00000000..019f7ecb
--- /dev/null
+++ b/docs/user_guide/heuristics.md
@@ -0,0 +1,34 @@
+# Active learning heuristics
+
+**Heuristics** take a set of predictions and output the order in which the corresponding samples should be labelled.
+
+A simple heuristic would be to prioritize items where the model has low confidence.
+We will cover the two main heuristics: **Entropy** and **BALD**.
+
+
+### Entropy
+
+The goal of this heuristic is to maximize information. To do so, we will compute the entropy of each prediction before ordering them.
+
+Let $p_c(x)$ be the probability that input $x$ belongs to class $c$. The entropy is then:
+
+$$
+H(x) = -\sum_{c=1}^{C} p_c(x) \log p_c(x)
+$$
+
+This score reflects how informative knowing the true label of $x$ would be.
+Naturally, the next item to label would be $\arg\max_{x \in {\cal D}} H(x)$, where ${\cal D}$ is our dataset.
+
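+As a rough illustration only (not Baal's built-in implementation), here is a minimal NumPy sketch of entropy-based ranking, assuming `probs` holds softmax outputs of shape `(n_samples, n_classes)`; the function and variable names are purely illustrative:
+
+```python
+import numpy as np
+
+def entropy_ranking(probs: np.ndarray, eps: float = 1e-12) -> np.ndarray:
+    """Return sample indices ordered from most to least uncertain."""
+    # H(x) = -sum_c p_c(x) log p_c(x), computed per sample.
+    scores = -np.sum(probs * np.log(probs + eps), axis=-1)
+    return np.argsort(-scores)  # highest entropy first
+
+# Toy usage: the near-uniform rows are ranked first.
+probs = np.array([[0.90, 0.05, 0.05],
+                  [0.40, 0.30, 0.30],
+                  [0.34, 0.33, 0.33]])
+print(entropy_ranking(probs))  # -> [2 1 0]
+```
+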
+A drawback of this method is that it doesn't differentiate between *aleatoric* and *epistemic* uncertainty.
+To separate the two, we use BALD.
+
+### BALD
+
+Bayesian active learning by disagreement, or BALD (Houlsby et al. 2011), is the basis of most modern active learning heuristics.
+
+From a Bayesian model $f$, we draw $I$ predictions per sample $x$.
+
+We then want to maximize the mutual information between a prediction and the model's parameters. This is done by looking at how much the predictions disagree with each other.
+If the prediction "flips" often across draws, the item is close to a decision boundary and thus hard for the model to fit.
+
+$$
+{\cal I}[y, \theta \mid x, {\cal D}] = {\cal H}[y \mid x, {\cal D}] - {\cal E}_{p(\theta \mid {\cal D})}\left[{\cal H}[y \mid x, \theta]\right]
+$$
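+
+For intuition only, here is a minimal NumPy sketch of this estimate, assuming `probs` stacks the $I$ stochastic predictions into an array of shape `(n_samples, n_classes, n_iterations)`; Baal ships ready-made heuristics, so this sketch (with illustrative names) only spells out the math:
+
+```python
+import numpy as np
+
+def bald_ranking(probs: np.ndarray, eps: float = 1e-12) -> np.ndarray:
+    """Rank samples by estimated mutual information, highest first."""
+    mean_probs = probs.mean(axis=-1)                                       # p(y | x, D), shape (N, C)
+    entropy_of_mean = -np.sum(mean_probs * np.log(mean_probs + eps), -1)   # H[y | x, D]
+    mean_of_entropy = -np.sum(probs * np.log(probs + eps), 1).mean(-1)     # E_theta[ H[y | x, theta] ]
+    return np.argsort(-(entropy_of_mean - mean_of_entropy))
+```
+
+Samples whose stochastic predictions disagree the most (i.e. have high mutual information) come first, which matches the "flipping" intuition above.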
diff --git a/mkdocs.yml b/mkdocs.yml
index 8073c11d..5db0fab7 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -41,6 +41,8 @@ markdown_extensions:
emoji_generator: !!python/name:materialx.emoji.to_svg
- pymdownx.highlight:
anchor_linenums: true
+ - admonition
+ - pymdownx.details
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.superfences
@@ -57,7 +59,7 @@ nav:
- Cheat Sheet: user_guide/baal_cheatsheet.md
- Active data structure: notebooks/fundamentals/active-learning.ipynb
- Computing uncertainty:
- - Stochastic models: notebooks/fundamentals/active-learning.ipynb
+ - Stochastic models: notebooks/fundamentals/posteriors.ipynb
- Heuristics: user_guide/heuristics.md
- API:
- api/index.md
@@ -77,19 +79,18 @@ nav:
- tutorials/label-studio.md
- notebooks/compatibility/nlp_classification.ipynb
- notebooks/compatibility/sklearn_tutorial.ipynb
- - notebooks/baal_prod_cls.ipynb
- - notebooks/deep_ensemble.ipynb
+ - Active learning for research: notebooks/active_learning_process.ipynb
+ - Active learning for production: notebooks/baal_prod_cls.ipynb
+ - Deep Ensembles for active learning: notebooks/deep_ensemble.ipynb
- Research:
- research/index.md
- Technical Reports:
- - notebooks/fairness/ActiveFairness.ipynb
+ - Active Fairness: notebooks/fairness/ActiveFairness.ipynb
- research/dirichlet_calibration.md
- research/double_descent.md
- Literature:
- research/literature/index.md
- - research/literature/core-papers.md
- Additional papers:
- - research/literature/more_papers.md
- research/literature/Additional papers/dmi.md
- research/literature/Additional papers/duq.md
- research/literature/Additional papers/gyolov3.md
From b1e6937127bb6c195a1ceed833de5e8851625153 Mon Sep 17 00:00:00 2001
From: Dref360
Date: Thu, 14 Jul 2022 16:23:40 -0400
Subject: [PATCH 04/10] Merge master
---
poetry.lock | 1963 ++++++++++-----------------------------------------
1 file changed, 372 insertions(+), 1591 deletions(-)
diff --git a/poetry.lock b/poetry.lock
index c77306c1..4829b32c 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -40,63 +40,15 @@ python-versions = ">=3.6"
frozenlist = ">=1.1.0"
[[package]]
-name = "alabaster"
-version = "0.7.12"
-description = "A configurable sidebar-enabled Sphinx theme"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[[package]]
-name = "appnope"
-version = "0.1.3"
-description = "Disable App Nap on macOS >= 10.9"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[[package]]
-name = "argon2-cffi"
-version = "21.3.0"
-description = "The secure Argon2 password hashing algorithm."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-argon2-cffi-bindings = "*"
-typing-extensions = {version = "*", markers = "python_version < \"3.8\""}
-
-[package.extras]
-dev = ["pre-commit", "cogapp", "tomli", "coverage[toml] (>=5.0.2)", "hypothesis", "pytest", "sphinx", "sphinx-notfound-page", "furo"]
-docs = ["sphinx", "sphinx-notfound-page", "furo"]
-tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pytest"]
-
-[[package]]
-name = "argon2-cffi-bindings"
-version = "21.2.0"
-description = "Low-level CFFI bindings for Argon2"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-cffi = ">=1.0.1"
-
-[package.extras]
-dev = ["pytest", "cogapp", "pre-commit", "wheel"]
-tests = ["pytest"]
-
-[[package]]
-name = "asteroid-sphinx-theme"
-version = "0.0.3"
-description = "Asteroid: Sphinx Theme"
+name = "astunparse"
+version = "1.6.3"
+description = "An AST unparser for Python"
category = "dev"
optional = false
python-versions = "*"
[package.dependencies]
-sphinx = "*"
+six = ">=1.6.1,<2.0"
[[package]]
name = "async-timeout"
@@ -139,25 +91,6 @@ docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "cloudpickle"]
tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "cloudpickle"]
-[[package]]
-name = "babel"
-version = "2.10.3"
-description = "Internationalization utilities"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-pytz = ">=2015.7"
-
-[[package]]
-name = "backcall"
-version = "0.2.0"
-description = "Specifications for callback functions passed in to an API"
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "bandit"
version = "1.7.4"
@@ -194,25 +127,29 @@ lxml = ["lxml"]
[[package]]
name = "black"
-version = "22.6.0"
+version = "21.12b0"
description = "The uncompromising code formatter."
category = "dev"
optional = false
python-versions = ">=3.6.2"
[package.dependencies]
-click = ">=8.0.0"
+click = ">=7.1.2"
mypy-extensions = ">=0.4.3"
-pathspec = ">=0.9.0"
+pathspec = ">=0.9.0,<1"
platformdirs = ">=2"
-tomli = {version = ">=1.1.0", markers = "python_full_version < \"3.11.0a7\""}
+tomli = ">=0.2.6,<2.0.0"
typed-ast = {version = ">=1.4.2", markers = "python_version < \"3.8\" and implementation_name == \"cpython\""}
-typing-extensions = {version = ">=3.10.0.0", markers = "python_version < \"3.10\""}
+typing-extensions = [
+ {version = ">=3.10.0.0", markers = "python_version < \"3.10\""},
+ {version = "!=3.10.0.1", markers = "python_version >= \"3.10\""},
+]
[package.extras]
colorama = ["colorama (>=0.4.3)"]
d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
+python2 = ["typed-ast (>=1.4.3)"]
uvloop = ["uvloop (>=0.15.2)"]
[[package]]
@@ -231,6 +168,14 @@ webencodings = "*"
css = ["tinycss2 (>=1.1.0,<1.2)"]
dev = ["build (==0.8.0)", "flake8 (==4.0.1)", "hashin (==0.17.0)", "pip-tools (==6.6.2)", "pytest (==7.1.2)", "Sphinx (==4.3.2)", "tox (==3.25.0)", "twine (==4.0.1)", "wheel (==0.37.1)", "black (==22.3.0)", "mypy (==0.961)"]
+[[package]]
+name = "cached-property"
+version = "1.5.2"
+description = "A decorator for caching properties in classes."
+category = "dev"
+optional = false
+python-versions = "*"
+
[[package]]
name = "cachetools"
version = "5.2.0"
@@ -289,17 +234,6 @@ category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
-[[package]]
-name = "commonmark"
-version = "0.9.1"
-description = "Python parser for the CommonMark Markdown spec"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.extras]
-test = ["flake8 (==3.7.8)", "hypothesis (==3.55.3)"]
-
[[package]]
name = "coverage"
version = "6.4.1"
@@ -347,32 +281,16 @@ xxhash = "*"
apache-beam = ["apache-beam (>=2.26.0)"]
audio = ["librosa"]
benchmarks = ["numpy (==1.18.5)", "tensorflow (==2.3.0)", "torch (==1.6.0)", "transformers (==3.0.2)"]
-dev = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[s3,server] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "black (>=22.0,<23.0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)", "importlib-resources"]
+dev = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[server,s3] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "black (>=22.0,<23.0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)", "importlib-resources"]
docs = ["docutils (==0.16.0)", "recommonmark", "sphinx (==3.1.2)", "sphinx-markdown-tables", "sphinx-rtd-theme (==0.4.3)", "sphinxext-opengraph (==0.4.1)", "sphinx-copybutton", "fsspec (<2021.9.0)", "s3fs", "sphinx-panels", "sphinx-inline-tabs", "myst-parser", "Markdown (!=3.3.5)"]
quality = ["black (>=22.0,<23.0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)"]
s3 = ["fsspec", "boto3", "botocore", "s3fs"]
tensorflow = ["tensorflow (>=2.2.0,!=2.6.0,!=2.6.1)"]
tensorflow_gpu = ["tensorflow-gpu (>=2.2.0,!=2.6.0,!=2.6.1)"]
-tests = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[s3,server] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "importlib-resources"]
+tests = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[server,s3] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "importlib-resources"]
torch = ["torch"]
vision = ["Pillow (>=6.2.1)"]
-[[package]]
-name = "debugpy"
-version = "1.6.0"
-description = "An implementation of the Debug Adapter Protocol for Python"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[[package]]
-name = "decorator"
-version = "5.1.1"
-description = "Decorators for Humans"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
[[package]]
name = "defusedxml"
version = "0.7.1"
@@ -518,6 +436,20 @@ smb = ["smbprotocol"]
ssh = ["paramiko"]
tqdm = ["tqdm"]
+[[package]]
+name = "ghp-import"
+version = "2.1.0"
+description = "Copy your docs directly to the gh-pages branch."
+category = "dev"
+optional = false
+python-versions = "*"
+
+[package.dependencies]
+python-dateutil = ">=2.8.1"
+
+[package.extras]
+dev = ["twine", "markdown", "flake8", "wheel"]
+
[[package]]
name = "gitdb"
version = "4.0.9"
@@ -576,6 +508,20 @@ requests-oauthlib = ">=0.7.0"
[package.extras]
tool = ["click (>=6.0.0)"]
+[[package]]
+name = "griffe"
+version = "0.22.0"
+description = "Signatures for entire Python programs. Extract the structure, the frame, the skeleton of your project, to generate API documentation or find breaking changes in your API."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+cached-property = {version = "*", markers = "python_version < \"3.8\""}
+
+[package.extras]
+async = ["aiofiles (>=0.7,<1.0)"]
+
[[package]]
name = "grpcio"
version = "1.47.0"
@@ -657,14 +603,6 @@ category = "main"
optional = false
python-versions = ">=3.5"
-[[package]]
-name = "imagesize"
-version = "1.4.1"
-description = "Getting image size from png/jpeg/jpeg2000/gif file"
-category = "dev"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-
[[package]]
name = "importlib-metadata"
version = "4.12.0"
@@ -705,104 +643,6 @@ category = "dev"
optional = false
python-versions = "*"
-[[package]]
-name = "ipykernel"
-version = "6.15.0"
-description = "IPython Kernel for Jupyter"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-appnope = {version = "*", markers = "platform_system == \"Darwin\""}
-debugpy = ">=1.0"
-ipython = ">=7.23.1"
-jupyter-client = ">=6.1.12"
-matplotlib-inline = ">=0.1"
-nest-asyncio = "*"
-packaging = "*"
-psutil = "*"
-pyzmq = ">=17"
-tornado = ">=6.1"
-traitlets = ">=5.1.0"
-
-[package.extras]
-test = ["flaky", "ipyparallel", "pre-commit", "pytest-cov", "pytest-timeout", "pytest (>=6.0)"]
-
-[[package]]
-name = "ipython"
-version = "7.34.0"
-description = "IPython: Productive Interactive Computing"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-appnope = {version = "*", markers = "sys_platform == \"darwin\""}
-backcall = "*"
-colorama = {version = "*", markers = "sys_platform == \"win32\""}
-decorator = "*"
-jedi = ">=0.16"
-matplotlib-inline = "*"
-pexpect = {version = ">4.3", markers = "sys_platform != \"win32\""}
-pickleshare = "*"
-prompt-toolkit = ">=2.0.0,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.1.0"
-pygments = "*"
-traitlets = ">=4.2"
-
-[package.extras]
-all = ["Sphinx (>=1.3)", "ipykernel", "ipyparallel", "ipywidgets", "nbconvert", "nbformat", "nose (>=0.10.1)", "notebook", "numpy (>=1.17)", "pygments", "qtconsole", "requests", "testpath"]
-doc = ["Sphinx (>=1.3)"]
-kernel = ["ipykernel"]
-nbconvert = ["nbconvert"]
-nbformat = ["nbformat"]
-notebook = ["notebook", "ipywidgets"]
-parallel = ["ipyparallel"]
-qtconsole = ["qtconsole"]
-test = ["nose (>=0.10.1)", "requests", "testpath", "pygments", "nbformat", "ipykernel", "numpy (>=1.17)"]
-
-[[package]]
-name = "ipython-genutils"
-version = "0.2.0"
-description = "Vestigial utilities from IPython"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[[package]]
-name = "ipywidgets"
-version = "7.7.1"
-description = "IPython HTML widgets for Jupyter"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-ipykernel = ">=4.5.1"
-ipython = {version = ">=4.0.0", markers = "python_version >= \"3.3\""}
-ipython-genutils = ">=0.2.0,<0.3.0"
-jupyterlab-widgets = {version = ">=1.0.0", markers = "python_version >= \"3.6\""}
-traitlets = ">=4.3.1"
-widgetsnbextension = ">=3.6.0,<3.7.0"
-
-[package.extras]
-test = ["pytest (>=3.6.0)", "pytest-cov", "mock"]
-
-[[package]]
-name = "jedi"
-version = "0.18.1"
-description = "An autocompletion tool for Python that can be used for text editors."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-parso = ">=0.8.0,<0.9.0"
-
-[package.extras]
-qa = ["flake8 (==3.8.3)", "mypy (==0.782)"]
-testing = ["Django (<3.1)", "colorama", "docopt", "pytest (<7.0.0)"]
-
[[package]]
name = "jinja2"
version = "3.1.2"
@@ -827,7 +667,7 @@ python-versions = ">=3.6"
[[package]]
name = "jsonargparse"
-version = "4.10.2"
+version = "4.11.0"
description = "Parsing of command line options, yaml/jsonnet config files and/or environment variables based on argparse."
category = "main"
optional = true
@@ -910,21 +750,6 @@ traitlets = "*"
[package.extras]
test = ["ipykernel", "pre-commit", "pytest", "pytest-cov", "pytest-timeout"]
-[[package]]
-name = "jupyter-sphinx"
-version = "0.3.2"
-description = "Jupyter Sphinx Extensions"
-category = "dev"
-optional = false
-python-versions = ">= 3.6"
-
-[package.dependencies]
-IPython = "*"
-ipywidgets = ">=7.0.0"
-nbconvert = ">=5.5"
-nbformat = "*"
-Sphinx = ">=2"
-
[[package]]
name = "jupyterlab-pygments"
version = "0.2.2"
@@ -934,12 +759,23 @@ optional = false
python-versions = ">=3.7"
[[package]]
-name = "jupyterlab-widgets"
-version = "1.1.1"
-description = "A JupyterLab extension."
+name = "jupytext"
+version = "1.14.0"
+description = "Jupyter notebooks as Markdown documents, Julia, Python or R scripts"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = "~=3.6"
+
+[package.dependencies]
+markdown-it-py = ">=1.0.0,<3.0.0"
+mdit-py-plugins = "*"
+nbformat = "*"
+pyyaml = "*"
+toml = "*"
+
+[package.extras]
+rst2md = ["sphinx-gallery (>=0.7.0,<0.8.0)"]
+toml = ["toml"]
[[package]]
name = "kiwisolver"
@@ -996,7 +832,7 @@ name = "markdown"
version = "3.3.7"
description = "Python implementation of Markdown."
category = "main"
-optional = true
+optional = false
python-versions = ">=3.6"
[package.dependencies]
@@ -1005,6 +841,28 @@ importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""}
[package.extras]
testing = ["coverage", "pyyaml"]
+[[package]]
+name = "markdown-it-py"
+version = "2.1.0"
+description = "Python port of markdown-it. Markdown parsing, done right!"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+mdurl = ">=0.1,<1.0"
+typing_extensions = {version = ">=3.7.4", markers = "python_version < \"3.8\""}
+
+[package.extras]
+benchmarking = ["psutil", "pytest", "pytest-benchmark (>=3.2,<4.0)"]
+code_style = ["pre-commit (==2.6)"]
+compare = ["commonmark (>=0.9.1,<0.10.0)", "markdown (>=3.3.6,<3.4.0)", "mistletoe (>=0.8.1,<0.9.0)", "mistune (>=2.0.2,<2.1.0)", "panflute (>=2.1.3,<2.2.0)"]
+linkify = ["linkify-it-py (>=1.0,<2.0)"]
+plugins = ["mdit-py-plugins"]
+profiling = ["gprof2dot"]
+rtd = ["attrs", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx-book-theme"]
+testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
+
[[package]]
name = "markupsafe"
version = "2.1.1"
@@ -1033,23 +891,44 @@ python-dateutil = ">=2.7"
setuptools_scm = ">=4"
[[package]]
-name = "matplotlib-inline"
-version = "0.1.3"
-description = "Inline Matplotlib backend for Jupyter"
+name = "mccabe"
+version = "0.6.1"
+description = "McCabe checker, plugin for flake8"
category = "dev"
optional = false
-python-versions = ">=3.5"
+python-versions = "*"
+
+[[package]]
+name = "mdit-py-plugins"
+version = "0.3.0"
+description = "Collection of plugins for markdown-it-py"
+category = "dev"
+optional = false
+python-versions = "~=3.6"
[package.dependencies]
-traitlets = "*"
+markdown-it-py = ">=1.0.0,<3.0.0"
+
+[package.extras]
+code_style = ["pre-commit (==2.6)"]
+rtd = ["myst-parser (>=0.14.0,<0.15.0)", "sphinx-book-theme (>=0.1.0,<0.2.0)"]
+testing = ["coverage", "pytest (>=3.6,<4)", "pytest-cov", "pytest-regressions"]
[[package]]
-name = "mccabe"
-version = "0.6.1"
-description = "McCabe checker, plugin for flake8"
+name = "mdurl"
+version = "0.1.1"
+description = "Markdown URL utilities"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.7"
+
+[[package]]
+name = "mergedeep"
+version = "1.3.4"
+description = "A deep merge function for 🐍."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
[[package]]
name = "mistune"
@@ -1059,6 +938,127 @@ category = "dev"
optional = false
python-versions = "*"
+[[package]]
+name = "mkdocs"
+version = "1.3.0"
+description = "Project documentation with Markdown."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+
+[package.dependencies]
+click = ">=3.3"
+ghp-import = ">=1.0"
+importlib-metadata = ">=4.3"
+Jinja2 = ">=2.10.2"
+Markdown = ">=3.2.1"
+mergedeep = ">=1.3.4"
+packaging = ">=20.5"
+PyYAML = ">=3.10"
+pyyaml-env-tag = ">=0.1"
+watchdog = ">=2.0"
+
+[package.extras]
+i18n = ["babel (>=2.9.0)"]
+
+[[package]]
+name = "mkdocs-autorefs"
+version = "0.4.1"
+description = "Automatically link across pages in MkDocs."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+Markdown = ">=3.3"
+mkdocs = ">=1.1"
+
+[[package]]
+name = "mkdocs-jupyter"
+version = "0.21.0"
+description = "Use Jupyter in mkdocs websites"
+category = "dev"
+optional = false
+python-versions = ">=3.7.1,<4"
+
+[package.dependencies]
+jupytext = ">=1.13.8,<2.0.0"
+mkdocs = ">=1.2.3,<2.0.0"
+mkdocs-material = ">=8.0.0,<9.0.0"
+nbconvert = ">=6.2.0,<7.0.0"
+Pygments = ">=2.12.0,<3.0.0"
+
+[[package]]
+name = "mkdocs-material"
+version = "8.3.9"
+description = "Documentation that simply works"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+jinja2 = ">=3.0.2"
+markdown = ">=3.2"
+mkdocs = ">=1.3.0"
+mkdocs-material-extensions = ">=1.0.3"
+pygments = ">=2.12"
+pymdown-extensions = ">=9.4"
+
+[[package]]
+name = "mkdocs-material-extensions"
+version = "1.0.3"
+description = "Extension pack for Python Markdown."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+
+[[package]]
+name = "mkdocstrings"
+version = "0.18.1"
+description = "Automatic documentation from sources, for MkDocs."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+Jinja2 = ">=2.11.1"
+Markdown = ">=3.3"
+MarkupSafe = ">=1.1"
+mkdocs = ">=1.2"
+mkdocs-autorefs = ">=0.3.1"
+mkdocstrings-python = {version = ">=0.5.2", optional = true, markers = "extra == \"python\""}
+mkdocstrings-python-legacy = ">=0.2"
+pymdown-extensions = ">=6.3"
+
+[package.extras]
+crystal = ["mkdocstrings-crystal (>=0.3.4)"]
+python = ["mkdocstrings-python (>=0.5.2)"]
+python-legacy = ["mkdocstrings-python-legacy (>=0.2.1)"]
+
+[[package]]
+name = "mkdocstrings-python"
+version = "0.6.6"
+description = "A Python handler for mkdocstrings."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+griffe = ">=0.11.1"
+mkdocstrings = ">=0.18"
+
+[[package]]
+name = "mkdocstrings-python-legacy"
+version = "0.2.2"
+description = "A legacy Python handler for mkdocstrings."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+mkdocstrings = ">=0.18"
+pytkdocs = ">=0.14"
+
[[package]]
name = "multidict"
version = "6.0.2"
@@ -1172,22 +1172,6 @@ traitlets = ">=5.1"
[package.extras]
test = ["check-manifest", "testpath", "pytest", "pre-commit"]
-[[package]]
-name = "nbsphinx"
-version = "0.8.9"
-description = "Jupyter Notebook Tools for Sphinx"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-docutils = "*"
-jinja2 = "*"
-nbconvert = "!=5.4"
-nbformat = "*"
-sphinx = ">=1.8"
-traitlets = ">=5"
-
[[package]]
name = "nest-asyncio"
version = "1.5.5"
@@ -1196,36 +1180,6 @@ category = "dev"
optional = false
python-versions = ">=3.5"
-[[package]]
-name = "notebook"
-version = "6.4.12"
-description = "A web-based notebook environment for interactive computing"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-argon2-cffi = "*"
-ipykernel = "*"
-ipython-genutils = "*"
-jinja2 = "*"
-jupyter-client = ">=5.3.4"
-jupyter-core = ">=4.6.1"
-nbconvert = ">=5"
-nbformat = "*"
-nest-asyncio = ">=1.5"
-prometheus-client = "*"
-pyzmq = ">=17"
-Send2Trash = ">=1.8.0"
-terminado = ">=0.8.3"
-tornado = ">=6.1"
-traitlets = ">=4.2.1"
-
-[package.extras]
-docs = ["sphinx", "nbsphinx", "sphinxcontrib-github-alt", "sphinx-rtd-theme", "myst-parser"]
-json-logging = ["json-logging"]
-test = ["pytest", "coverage", "requests", "testpath", "nbval", "selenium", "pytest-cov", "requests-unixsocket"]
-
[[package]]
name = "numpy"
version = "1.21.6"
@@ -1234,21 +1188,6 @@ category = "main"
optional = false
python-versions = ">=3.7,<3.11"
-[[package]]
-name = "numpydoc"
-version = "1.4.0"
-description = "Sphinx extension to support docstrings in Numpy format"
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-Jinja2 = ">=2.10"
-sphinx = ">=3.0"
-
-[package.extras]
-testing = ["pytest", "pytest-cov", "matplotlib"]
-
[[package]]
name = "oauthlib"
version = "3.2.0"
@@ -1297,18 +1236,6 @@ category = "dev"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-[[package]]
-name = "parso"
-version = "0.8.3"
-description = "A Python Parser"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.extras]
-qa = ["flake8 (==3.8.3)", "mypy (==0.782)"]
-testing = ["docopt", "pytest (<6.0.0)"]
-
[[package]]
name = "pathspec"
version = "0.9.0"
@@ -1325,25 +1252,6 @@ category = "dev"
optional = false
python-versions = ">=2.6"
-[[package]]
-name = "pexpect"
-version = "4.8.0"
-description = "Pexpect allows easy control of interactive console applications."
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-ptyprocess = ">=0.5"
-
-[[package]]
-name = "pickleshare"
-version = "0.7.5"
-description = "Tiny 'shelve'-like database with concurrency support"
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "pillow"
version = "9.2.0"
@@ -1383,28 +1291,6 @@ importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""}
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
-[[package]]
-name = "prometheus-client"
-version = "0.14.1"
-description = "Python client for the Prometheus monitoring system."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.extras]
-twisted = ["twisted"]
-
-[[package]]
-name = "prompt-toolkit"
-version = "3.0.30"
-description = "Library for building powerful interactive command lines in Python"
-category = "dev"
-optional = false
-python-versions = ">=3.6.2"
-
-[package.dependencies]
-wcwidth = "*"
-
[[package]]
name = "protobuf"
version = "3.19.4"
@@ -1413,25 +1299,6 @@ category = "main"
optional = true
python-versions = ">=3.5"
-[[package]]
-name = "psutil"
-version = "5.9.1"
-description = "Cross-platform lib for process and system monitoring in Python."
-category = "dev"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-
-[package.extras]
-test = ["ipaddress", "mock", "enum34", "pywin32", "wmi"]
-
-[[package]]
-name = "ptyprocess"
-version = "0.7.0"
-description = "Run a subprocess in a pseudo terminal"
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "py"
version = "1.11.0"
@@ -1510,6 +1377,17 @@ category = "dev"
optional = false
python-versions = ">=3.6"
+[[package]]
+name = "pymdown-extensions"
+version = "9.5"
+description = "Extension pack for Python Markdown."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+markdown = ">=3.2"
+
[[package]]
name = "pyparsing"
version = "3.0.9"
@@ -1592,6 +1470,22 @@ python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
[package.dependencies]
six = ">=1.5"
+[[package]]
+name = "pytkdocs"
+version = "0.16.1"
+description = "Load Python objects documentation."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+astunparse = {version = ">=1.6", markers = "python_version < \"3.9\""}
+cached-property = {version = ">=1.5", markers = "python_version < \"3.8\""}
+typing-extensions = {version = ">=3.7", markers = "python_version < \"3.8\""}
+
+[package.extras]
+numpy-style = ["docstring_parser (>=0.7)"]
+
[[package]]
name = "pytorch-lightning"
version = "1.6.4"
@@ -1631,7 +1525,7 @@ name = "pytz"
version = "2022.1"
description = "World timezone definitions, modern and historical"
category = "main"
-optional = false
+optional = true
python-versions = "*"
[[package]]
@@ -1642,14 +1536,6 @@ category = "dev"
optional = false
python-versions = "*"
-[[package]]
-name = "pywinpty"
-version = "2.0.5"
-description = "Pseudo terminal support for Windows from Python."
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
[[package]]
name = "pyyaml"
version = "6.0"
@@ -1659,29 +1545,27 @@ optional = false
python-versions = ">=3.6"
[[package]]
-name = "pyzmq"
-version = "23.2.0"
-description = "Python bindings for 0MQ"
+name = "pyyaml-env-tag"
+version = "0.1"
+description = "A custom YAML tag for referencing environment variables in YAML files. "
category = "dev"
optional = false
python-versions = ">=3.6"
[package.dependencies]
-cffi = {version = "*", markers = "implementation_name == \"pypy\""}
-py = {version = "*", markers = "implementation_name == \"pypy\""}
+pyyaml = "*"
[[package]]
-name = "recommonmark"
-version = "0.7.1"
-description = "A docutils-compatibility bridge to CommonMark, enabling you to write CommonMark inside of Docutils & Sphinx projects."
+name = "pyzmq"
+version = "23.2.0"
+description = "Python bindings for 0MQ"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
[package.dependencies]
-commonmark = ">=0.8.1"
-docutils = ">=0.11"
-sphinx = ">=1.3.1"
+cffi = {version = "*", markers = "implementation_name == \"pypy\""}
+py = {version = "*", markers = "implementation_name == \"pypy\""}
[[package]]
name = "regex"
@@ -1781,19 +1665,6 @@ python-versions = ">=3.7,<3.11"
[package.dependencies]
numpy = ">=1.16.5,<1.23.0"
-[[package]]
-name = "send2trash"
-version = "1.8.0"
-description = "Send file to trash natively under Mac OS X, Windows and Linux."
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.extras]
-nativelib = ["pyobjc-framework-cocoa", "pywin32"]
-objc = ["pyobjc-framework-cocoa"]
-win32 = ["pywin32"]
-
[[package]]
name = "setuptools-scm"
version = "7.0.4"
@@ -1828,14 +1699,6 @@ category = "dev"
optional = false
python-versions = ">=3.6"
-[[package]]
-name = "snowballstemmer"
-version = "2.2.0"
-description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms."
-category = "dev"
-optional = false
-python-versions = "*"
-
[[package]]
name = "soupsieve"
version = "2.3.2.post1"
@@ -1844,153 +1707,6 @@ category = "dev"
optional = false
python-versions = ">=3.6"
-[[package]]
-name = "sphinx"
-version = "5.0.2"
-description = "Python documentation generator"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-alabaster = ">=0.7,<0.8"
-babel = ">=1.3"
-colorama = {version = ">=0.3.5", markers = "sys_platform == \"win32\""}
-docutils = ">=0.14,<0.19"
-imagesize = "*"
-importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""}
-Jinja2 = ">=2.3"
-packaging = "*"
-Pygments = ">=2.0"
-requests = ">=2.5.0"
-snowballstemmer = ">=1.1"
-sphinxcontrib-applehelp = "*"
-sphinxcontrib-devhelp = "*"
-sphinxcontrib-htmlhelp = ">=2.0.0"
-sphinxcontrib-jsmath = "*"
-sphinxcontrib-qthelp = "*"
-sphinxcontrib-serializinghtml = ">=1.1.5"
-
-[package.extras]
-docs = ["sphinxcontrib-websupport"]
-lint = ["flake8 (>=3.5.0)", "isort", "mypy (>=0.950)", "docutils-stubs", "types-typed-ast", "types-requests"]
-test = ["pytest (>=4.6)", "html5lib", "cython", "typed-ast"]
-
-[[package]]
-name = "sphinx-automodapi"
-version = "0.13"
-description = "Sphinx extension for auto-generating API documentation for entire modules"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-sphinx = ">=1.7"
-
-[package.extras]
-test = ["pytest", "pytest-cov", "cython", "codecov", "coverage (<5.0)"]
-
-[[package]]
-name = "sphinx-copybutton"
-version = "0.4.0"
-description = "Add a copy button to each of your code cells."
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.dependencies]
-sphinx = ">=1.8"
-
-[package.extras]
-code_style = ["pre-commit (==2.12.1)"]
-rtd = ["sphinx", "ipython", "sphinx-book-theme"]
-
-[[package]]
-name = "sphinx-rtd-theme"
-version = "0.5.2"
-description = "Read the Docs theme for Sphinx"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-docutils = "<0.17"
-sphinx = "*"
-
-[package.extras]
-dev = ["transifex-client", "sphinxcontrib-httpdomain", "bump2version"]
-
-[[package]]
-name = "sphinxcontrib-applehelp"
-version = "1.0.2"
-description = "sphinxcontrib-applehelp is a sphinx extension which outputs Apple help books"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
-[[package]]
-name = "sphinxcontrib-devhelp"
-version = "1.0.2"
-description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
-[[package]]
-name = "sphinxcontrib-htmlhelp"
-version = "2.0.0"
-description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
-category = "dev"
-optional = false
-python-versions = ">=3.6"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest", "html5lib"]
-
-[[package]]
-name = "sphinxcontrib-jsmath"
-version = "1.0.1"
-description = "A sphinx extension which renders display math in HTML via JavaScript"
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-test = ["pytest", "flake8", "mypy"]
-
-[[package]]
-name = "sphinxcontrib-qthelp"
-version = "1.0.3"
-description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
-[[package]]
-name = "sphinxcontrib-serializinghtml"
-version = "1.1.5"
-description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-
-[package.extras]
-lint = ["flake8", "mypy", "docutils-stubs"]
-test = ["pytest"]
-
[[package]]
name = "stevedore"
version = "3.5.0"
@@ -2056,22 +1772,6 @@ category = "main"
optional = true
python-versions = "*"
-[[package]]
-name = "terminado"
-version = "0.15.0"
-description = "Tornado websocket backend for the Xterm.js Javascript terminal emulator library."
-category = "dev"
-optional = false
-python-versions = ">=3.7"
-
-[package.dependencies]
-ptyprocess = {version = "*", markers = "os_name != \"nt\""}
-pywinpty = {version = ">=1.1.0", markers = "os_name == \"nt\""}
-tornado = ">=6.1.0"
-
-[package.extras]
-test = ["pre-commit", "pytest-timeout", "pytest (>=6.0)"]
-
[[package]]
name = "threadpoolctl"
version = "3.1.0"
@@ -2316,12 +2016,15 @@ secure = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "cer
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]]
-name = "wcwidth"
-version = "0.2.5"
-description = "Measures the displayed width of unicode strings in a terminal"
+name = "watchdog"
+version = "2.1.9"
+description = "Filesystem events monitoring"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
+
+[package.extras]
+watchmedo = ["PyYAML (>=3.10)"]
[[package]]
name = "webencodings"
@@ -2342,17 +2045,6 @@ python-versions = ">=3.7"
[package.extras]
watchdog = ["watchdog"]
-[[package]]
-name = "widgetsnbextension"
-version = "3.6.1"
-description = "IPython HTML widgets for Jupyter"
-category = "dev"
-optional = false
-python-versions = "*"
-
-[package.dependencies]
-notebook = ">=4.4.1"
-
[[package]]
name = "xxhash"
version = "3.0.0"
@@ -2392,14 +2084,11 @@ vision = ["torchvision", "lightning-flash"]
[metadata]
lock-version = "1.1"
-python-versions = ">=3.7,<3.10"
-content-hash = "5f70574811c3e0755f78d2aefb0eb8dbf7a188c29497da2aaf5598dff2919a99"
+python-versions = ">=3.7.1,<3.11"
+content-hash = "b078b2c2d31acd58eeebb4073f106d97bffe25443962a34129c5be61c91eb51b"
[metadata.files]
-absl-py = [
- {file = "absl-py-1.1.0.tar.gz", hash = "sha256:3aa39f898329c2156ff525dfa69ce709e42d77aab18bf4917719d6f260aa6a08"},
- {file = "absl_py-1.1.0-py3-none-any.whl", hash = "sha256:db97287655e30336938f8058d2c81ed2be6af1d9b6ebbcd8df1080a6c7fcd24e"},
-]
+absl-py = []
aiohttp = [
{file = "aiohttp-3.8.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1ed0b6477896559f17b9eaeb6d38e07f7f9ffe40b9f0f9627ae8b9926ae260a8"},
{file = "aiohttp-3.8.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7dadf3c307b31e0e61689cbf9e06be7a867c563d5a63ce9dca578f956609abf8"},
@@ -2478,53 +2167,12 @@ aiosignal = [
{file = "aiosignal-1.2.0-py3-none-any.whl", hash = "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a"},
{file = "aiosignal-1.2.0.tar.gz", hash = "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"},
]
-alabaster = [
- {file = "alabaster-0.7.12-py2.py3-none-any.whl", hash = "sha256:446438bdcca0e05bd45ea2de1668c1d9b032e1a9154c2c259092d77031ddd359"},
- {file = "alabaster-0.7.12.tar.gz", hash = "sha256:a661d72d58e6ea8a57f7a86e37d86716863ee5e92788398526d58b26a4e4dc02"},
-]
-appnope = [
- {file = "appnope-0.1.3-py2.py3-none-any.whl", hash = "sha256:265a455292d0bd8a72453494fa24df5a11eb18373a60c7c0430889f22548605e"},
- {file = "appnope-0.1.3.tar.gz", hash = "sha256:02bd91c4de869fbb1e1c50aafc4098827a7a54ab2f39d9dcba6c9547ed920e24"},
-]
-argon2-cffi = [
- {file = "argon2-cffi-21.3.0.tar.gz", hash = "sha256:d384164d944190a7dd7ef22c6aa3ff197da12962bd04b17f64d4e93d934dba5b"},
- {file = "argon2_cffi-21.3.0-py3-none-any.whl", hash = "sha256:8c976986f2c5c0e5000919e6de187906cfd81fb1c72bf9d88c01177e77da7f80"},
-]
-argon2-cffi-bindings = [
- {file = "argon2-cffi-bindings-21.2.0.tar.gz", hash = "sha256:bb89ceffa6c791807d1305ceb77dbfacc5aa499891d2c55661c6459651fc39e3"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:ccb949252cb2ab3a08c02024acb77cfb179492d5701c7cbdbfd776124d4d2367"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9524464572e12979364b7d600abf96181d3541da11e23ddf565a32e70bd4dc0d"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b746dba803a79238e925d9046a63aa26bf86ab2a2fe74ce6b009a1c3f5c8f2ae"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:58ed19212051f49a523abb1dbe954337dc82d947fb6e5a0da60f7c8471a8476c"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:bd46088725ef7f58b5a1ef7ca06647ebaf0eb4baff7d1d0d177c6cc8744abd86"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-musllinux_1_1_i686.whl", hash = "sha256:8cd69c07dd875537a824deec19f978e0f2078fdda07fd5c42ac29668dda5f40f"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f1152ac548bd5b8bcecfb0b0371f082037e47128653df2e8ba6e914d384f3c3e"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-win32.whl", hash = "sha256:603ca0aba86b1349b147cab91ae970c63118a0f30444d4bc80355937c950c082"},
- {file = "argon2_cffi_bindings-21.2.0-cp36-abi3-win_amd64.whl", hash = "sha256:b2ef1c30440dbbcba7a5dc3e319408b59676e2e039e2ae11a8775ecf482b192f"},
- {file = "argon2_cffi_bindings-21.2.0-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:e415e3f62c8d124ee16018e491a009937f8cf7ebf5eb430ffc5de21b900dad93"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:3e385d1c39c520c08b53d63300c3ecc28622f076f4c2b0e6d7e796e9f6502194"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c3e3cc67fdb7d82c4718f19b4e7a87123caf8a93fde7e23cf66ac0337d3cb3f"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a22ad9800121b71099d0fb0a65323810a15f2e292f2ba450810a7316e128ee5"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f9f8b450ed0547e3d473fdc8612083fd08dd2120d6ac8f73828df9b7d45bb351"},
- {file = "argon2_cffi_bindings-21.2.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:93f9bf70084f97245ba10ee36575f0c3f1e7d7724d67d8e5b08e61787c320ed7"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:3b9ef65804859d335dc6b31582cad2c5166f0c3e7975f324d9ffaa34ee7e6583"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4966ef5848d820776f5f562a7d45fdd70c2f330c961d0d745b784034bd9f48d"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20ef543a89dee4db46a1a6e206cd015360e5a75822f76df533845c3cbaf72670"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ed2937d286e2ad0cc79a7087d3c272832865f779430e0cc2b4f3718d3159b0cb"},
- {file = "argon2_cffi_bindings-21.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:5e00316dabdaea0b2dd82d141cc66889ced0cdcbfa599e8b471cf22c620c329a"},
-]
-asteroid-sphinx-theme = [
- {file = "asteroid_sphinx_theme-0.0.3-py2.py3-none-any.whl", hash = "sha256:5939dd3c71ce384f4c03ea715618700410e95dfefbda2f81a6a9a0a1795d712d"},
- {file = "asteroid_sphinx_theme-0.0.3.tar.gz", hash = "sha256:e780466db174cf2ec75cecd5321fc80e2820bc7563434a6a8351a61dcdd03d75"},
-]
+astunparse = []
async-timeout = [
{file = "async-timeout-4.0.2.tar.gz", hash = "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15"},
{file = "async_timeout-4.0.2-py3-none-any.whl", hash = "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"},
]
-asynctest = [
- {file = "asynctest-0.13.0-py3-none-any.whl", hash = "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676"},
- {file = "asynctest-0.13.0.tar.gz", hash = "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"},
-]
+asynctest = []
atomicwrites = [
{file = "atomicwrites-1.4.0-py2.py3-none-any.whl", hash = "sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197"},
{file = "atomicwrites-1.4.0.tar.gz", hash = "sha256:ae70396ad1a434f9c7046fd2dd196fc04b12f9e91ffb859164193be8b6168a7a"},
@@ -2533,129 +2181,21 @@ attrs = [
{file = "attrs-21.4.0-py2.py3-none-any.whl", hash = "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4"},
{file = "attrs-21.4.0.tar.gz", hash = "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"},
]
-babel = [
- {file = "Babel-2.10.3-py3-none-any.whl", hash = "sha256:ff56f4892c1c4bf0d814575ea23471c230d544203c7748e8c68f0089478d48eb"},
- {file = "Babel-2.10.3.tar.gz", hash = "sha256:7614553711ee97490f732126dc077f8d0ae084ebc6a96e23db1482afabdb2c51"},
-]
-backcall = [
- {file = "backcall-0.2.0-py2.py3-none-any.whl", hash = "sha256:fbbce6a29f263178a1f7915c1940bde0ec2b2a967566fe1c65c1dfb7422bd255"},
- {file = "backcall-0.2.0.tar.gz", hash = "sha256:5cbdbf27be5e7cfadb448baf0aa95508f91f2bbc6c6437cd9cd06e2a4c215e1e"},
-]
-bandit = [
- {file = "bandit-1.7.4-py3-none-any.whl", hash = "sha256:412d3f259dab4077d0e7f0c11f50f650cc7d10db905d98f6520a95a18049658a"},
- {file = "bandit-1.7.4.tar.gz", hash = "sha256:2d63a8c573417bae338962d4b9b06fbc6080f74ecd955a092849e1e65c717bd2"},
-]
+bandit = []
beautifulsoup4 = [
{file = "beautifulsoup4-4.11.1-py3-none-any.whl", hash = "sha256:58d5c3d29f5a36ffeb94f02f0d786cd53014cf9b3b3951d42e0080d8a9498d30"},
{file = "beautifulsoup4-4.11.1.tar.gz", hash = "sha256:ad9aa55b65ef2808eb405f46cf74df7fcb7044d5cbc26487f96eb2ef2e436693"},
]
-black = [
- {file = "black-22.6.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f586c26118bc6e714ec58c09df0157fe2d9ee195c764f630eb0d8e7ccce72e69"},
- {file = "black-22.6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b270a168d69edb8b7ed32c193ef10fd27844e5c60852039599f9184460ce0807"},
- {file = "black-22.6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6797f58943fceb1c461fb572edbe828d811e719c24e03375fd25170ada53825e"},
- {file = "black-22.6.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c85928b9d5f83b23cee7d0efcb310172412fbf7cb9d9ce963bd67fd141781def"},
- {file = "black-22.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:f6fe02afde060bbeef044af7996f335fbe90b039ccf3f5eb8f16df8b20f77666"},
- {file = "black-22.6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:cfaf3895a9634e882bf9d2363fed5af8888802d670f58b279b0bece00e9a872d"},
- {file = "black-22.6.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:94783f636bca89f11eb5d50437e8e17fbc6a929a628d82304c80fa9cd945f256"},
- {file = "black-22.6.0-cp36-cp36m-win_amd64.whl", hash = "sha256:2ea29072e954a4d55a2ff58971b83365eba5d3d357352a07a7a4df0d95f51c78"},
- {file = "black-22.6.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e439798f819d49ba1c0bd9664427a05aab79bfba777a6db94fd4e56fae0cb849"},
- {file = "black-22.6.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:187d96c5e713f441a5829e77120c269b6514418f4513a390b0499b0987f2ff1c"},
- {file = "black-22.6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:074458dc2f6e0d3dab7928d4417bb6957bb834434516f21514138437accdbe90"},
- {file = "black-22.6.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a218d7e5856f91d20f04e931b6f16d15356db1c846ee55f01bac297a705ca24f"},
- {file = "black-22.6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:568ac3c465b1c8b34b61cd7a4e349e93f91abf0f9371eda1cf87194663ab684e"},
- {file = "black-22.6.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6c1734ab264b8f7929cef8ae5f900b85d579e6cbfde09d7387da8f04771b51c6"},
- {file = "black-22.6.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9a3ac16efe9ec7d7381ddebcc022119794872abce99475345c5a61aa18c45ad"},
- {file = "black-22.6.0-cp38-cp38-win_amd64.whl", hash = "sha256:b9fd45787ba8aa3f5e0a0a98920c1012c884622c6c920dbe98dbd05bc7c70fbf"},
- {file = "black-22.6.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7ba9be198ecca5031cd78745780d65a3f75a34b2ff9be5837045dce55db83d1c"},
- {file = "black-22.6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a3db5b6409b96d9bd543323b23ef32a1a2b06416d525d27e0f67e74f1446c8f2"},
- {file = "black-22.6.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:560558527e52ce8afba936fcce93a7411ab40c7d5fe8c2463e279e843c0328ee"},
- {file = "black-22.6.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b154e6bbde1e79ea3260c4b40c0b7b3109ffcdf7bc4ebf8859169a6af72cd70b"},
- {file = "black-22.6.0-cp39-cp39-win_amd64.whl", hash = "sha256:4af5bc0e1f96be5ae9bd7aaec219c901a94d6caa2484c21983d043371c733fc4"},
- {file = "black-22.6.0-py3-none-any.whl", hash = "sha256:ac609cf8ef5e7115ddd07d85d988d074ed00e10fbc3445aee393e70164a2219c"},
- {file = "black-22.6.0.tar.gz", hash = "sha256:6c6d39e28aed379aec40da1c65434c77d75e65bb59a1e1c283de545fb4e7c6c9"},
-]
-bleach = [
- {file = "bleach-5.0.1-py3-none-any.whl", hash = "sha256:085f7f33c15bd408dd9b17a4ad77c577db66d76203e5984b1bd59baeee948b2a"},
- {file = "bleach-5.0.1.tar.gz", hash = "sha256:0d03255c47eb9bd2f26aa9bb7f2107732e7e8fe195ca2f64709fcf3b0a4a085c"},
-]
-cachetools = [
- {file = "cachetools-5.2.0-py3-none-any.whl", hash = "sha256:f9f17d2aec496a9aa6b76f53e3b614c965223c061982d434d160f930c698a9db"},
- {file = "cachetools-5.2.0.tar.gz", hash = "sha256:6a94c6402995a99c3970cc7e4884bb60b4a8639938157eeed436098bf9831757"},
-]
+black = []
+bleach = []
+cached-property = []
+cachetools = []
certifi = [
{file = "certifi-2022.6.15-py3-none-any.whl", hash = "sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412"},
{file = "certifi-2022.6.15.tar.gz", hash = "sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d"},
]
-cffi = [
- {file = "cffi-1.15.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:a66d3508133af6e8548451b25058d5812812ec3798c886bf38ed24a98216fab2"},
- {file = "cffi-1.15.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:470c103ae716238bbe698d67ad020e1db9d9dba34fa5a899b5e21577e6d52ed2"},
- {file = "cffi-1.15.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:9ad5db27f9cabae298d151c85cf2bad1d359a1b9c686a275df03385758e2f914"},
- {file = "cffi-1.15.1-cp27-cp27m-win32.whl", hash = "sha256:b3bbeb01c2b273cca1e1e0c5df57f12dce9a4dd331b4fa1635b8bec26350bde3"},
- {file = "cffi-1.15.1-cp27-cp27m-win_amd64.whl", hash = "sha256:e00b098126fd45523dd056d2efba6c5a63b71ffe9f2bbe1a4fe1716e1d0c331e"},
- {file = "cffi-1.15.1-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:d61f4695e6c866a23a21acab0509af1cdfd2c013cf256bbf5b6b5e2695827162"},
- {file = "cffi-1.15.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:ed9cb427ba5504c1dc15ede7d516b84757c3e3d7868ccc85121d9310d27eed0b"},
- {file = "cffi-1.15.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:39d39875251ca8f612b6f33e6b1195af86d1b3e60086068be9cc053aa4376e21"},
- {file = "cffi-1.15.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:285d29981935eb726a4399badae8f0ffdff4f5050eaa6d0cfc3f64b857b77185"},
- {file = "cffi-1.15.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3eb6971dcff08619f8d91607cfc726518b6fa2a9eba42856be181c6d0d9515fd"},
- {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21157295583fe8943475029ed5abdcf71eb3911894724e360acff1d61c1d54bc"},
- {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5635bd9cb9731e6d4a1132a498dd34f764034a8ce60cef4f5319c0541159392f"},
- {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2012c72d854c2d03e45d06ae57f40d78e5770d252f195b93f581acf3ba44496e"},
- {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd86c085fae2efd48ac91dd7ccffcfc0571387fe1193d33b6394db7ef31fe2a4"},
- {file = "cffi-1.15.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fa6693661a4c91757f4412306191b6dc88c1703f780c8234035eac011922bc01"},
- {file = "cffi-1.15.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:59c0b02d0a6c384d453fece7566d1c7e6b7bae4fc5874ef2ef46d56776d61c9e"},
- {file = "cffi-1.15.1-cp310-cp310-win32.whl", hash = "sha256:cba9d6b9a7d64d4bd46167096fc9d2f835e25d7e4c121fb2ddfc6528fb0413b2"},
- {file = "cffi-1.15.1-cp310-cp310-win_amd64.whl", hash = "sha256:ce4bcc037df4fc5e3d184794f27bdaab018943698f4ca31630bc7f84a7b69c6d"},
- {file = "cffi-1.15.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3d08afd128ddaa624a48cf2b859afef385b720bb4b43df214f85616922e6a5ac"},
- {file = "cffi-1.15.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3799aecf2e17cf585d977b780ce79ff0dc9b78d799fc694221ce814c2c19db83"},
- {file = "cffi-1.15.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a591fe9e525846e4d154205572a029f653ada1a78b93697f3b5a8f1f2bc055b9"},
- {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3548db281cd7d2561c9ad9984681c95f7b0e38881201e157833a2342c30d5e8c"},
- {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:91fc98adde3d7881af9b59ed0294046f3806221863722ba7d8d120c575314325"},
- {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:94411f22c3985acaec6f83c6df553f2dbe17b698cc7f8ae751ff2237d96b9e3c"},
- {file = "cffi-1.15.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:03425bdae262c76aad70202debd780501fabeaca237cdfddc008987c0e0f59ef"},
- {file = "cffi-1.15.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cc4d65aeeaa04136a12677d3dd0b1c0c94dc43abac5860ab33cceb42b801c1e8"},
- {file = "cffi-1.15.1-cp311-cp311-win32.whl", hash = "sha256:a0f100c8912c114ff53e1202d0078b425bee3649ae34d7b070e9697f93c5d52d"},
- {file = "cffi-1.15.1-cp311-cp311-win_amd64.whl", hash = "sha256:04ed324bda3cda42b9b695d51bb7d54b680b9719cfab04227cdd1e04e5de3104"},
- {file = "cffi-1.15.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50a74364d85fd319352182ef59c5c790484a336f6db772c1a9231f1c3ed0cbd7"},
- {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e263d77ee3dd201c3a142934a086a4450861778baaeeb45db4591ef65550b0a6"},
- {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cec7d9412a9102bdc577382c3929b337320c4c4c4849f2c5cdd14d7368c5562d"},
- {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4289fc34b2f5316fbb762d75362931e351941fa95fa18789191b33fc4cf9504a"},
- {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:173379135477dc8cac4bc58f45db08ab45d228b3363adb7af79436135d028405"},
- {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6975a3fac6bc83c4a65c9f9fcab9e47019a11d3d2cf7f3c0d03431bf145a941e"},
- {file = "cffi-1.15.1-cp36-cp36m-win32.whl", hash = "sha256:2470043b93ff09bf8fb1d46d1cb756ce6132c54826661a32d4e4d132e1977adf"},
- {file = "cffi-1.15.1-cp36-cp36m-win_amd64.whl", hash = "sha256:30d78fbc8ebf9c92c9b7823ee18eb92f2e6ef79b45ac84db507f52fbe3ec4497"},
- {file = "cffi-1.15.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:198caafb44239b60e252492445da556afafc7d1e3ab7a1fb3f0584ef6d742375"},
- {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5ef34d190326c3b1f822a5b7a45f6c4535e2f47ed06fec77d3d799c450b2651e"},
- {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8102eaf27e1e448db915d08afa8b41d6c7ca7a04b7d73af6514df10a3e74bd82"},
- {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5df2768244d19ab7f60546d0c7c63ce1581f7af8b5de3eb3004b9b6fc8a9f84b"},
- {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8c4917bd7ad33e8eb21e9a5bbba979b49d9a97acb3a803092cbc1133e20343c"},
- {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2642fe3142e4cc4af0799748233ad6da94c62a8bec3a6648bf8ee68b1c7426"},
- {file = "cffi-1.15.1-cp37-cp37m-win32.whl", hash = "sha256:e229a521186c75c8ad9490854fd8bbdd9a0c9aa3a524326b55be83b54d4e0ad9"},
- {file = "cffi-1.15.1-cp37-cp37m-win_amd64.whl", hash = "sha256:a0b71b1b8fbf2b96e41c4d990244165e2c9be83d54962a9a1d118fd8657d2045"},
- {file = "cffi-1.15.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:320dab6e7cb2eacdf0e658569d2575c4dad258c0fcc794f46215e1e39f90f2c3"},
- {file = "cffi-1.15.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e74c6b51a9ed6589199c787bf5f9875612ca4a8a0785fb2d4a84429badaf22a"},
- {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5c84c68147988265e60416b57fc83425a78058853509c1b0629c180094904a5"},
- {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b926aa83d1edb5aa5b427b4053dc420ec295a08e40911296b9eb1b6170f6cca"},
- {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:87c450779d0914f2861b8526e035c5e6da0a3199d8f1add1a665e1cbc6fc6d02"},
- {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f2c9f67e9821cad2e5f480bc8d83b8742896f1242dba247911072d4fa94c192"},
- {file = "cffi-1.15.1-cp38-cp38-win32.whl", hash = "sha256:8b7ee99e510d7b66cdb6c593f21c043c248537a32e0bedf02e01e9553a172314"},
- {file = "cffi-1.15.1-cp38-cp38-win_amd64.whl", hash = "sha256:00a9ed42e88df81ffae7a8ab6d9356b371399b91dbdf0c3cb1e84c03a13aceb5"},
- {file = "cffi-1.15.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:54a2db7b78338edd780e7ef7f9f6c442500fb0d41a5a4ea24fff1c929d5af585"},
- {file = "cffi-1.15.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:fcd131dd944808b5bdb38e6f5b53013c5aa4f334c5cad0c72742f6eba4b73db0"},
- {file = "cffi-1.15.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7473e861101c9e72452f9bf8acb984947aa1661a7704553a9f6e4baa5ba64415"},
- {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c9a799e985904922a4d207a94eae35c78ebae90e128f0c4e521ce339396be9d"},
- {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3bcde07039e586f91b45c88f8583ea7cf7a0770df3a1649627bf598332cb6984"},
- {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33ab79603146aace82c2427da5ca6e58f2b3f2fb5da893ceac0c42218a40be35"},
- {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5d598b938678ebf3c67377cdd45e09d431369c3b1a5b331058c338e201f12b27"},
- {file = "cffi-1.15.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:db0fbb9c62743ce59a9ff687eb5f4afbe77e5e8403d6697f7446e5f609976f76"},
- {file = "cffi-1.15.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:98d85c6a2bef81588d9227dde12db8a7f47f639f4a17c9ae08e773aa9c697bf3"},
- {file = "cffi-1.15.1-cp39-cp39-win32.whl", hash = "sha256:40f4774f5a9d4f5e344f31a32b5096977b5d48560c5592e2f3d2c4374bd543ee"},
- {file = "cffi-1.15.1-cp39-cp39-win_amd64.whl", hash = "sha256:70df4e3b545a17496c9b3f41f5115e69a4f2e77e94e1d2a8e1070bc0c38c8a3c"},
- {file = "cffi-1.15.1.tar.gz", hash = "sha256:d400bfb9a37b1351253cb402671cea7e89bdecc294e8016a707f6d1d8ac934f9"},
-]
-charset-normalizer = [
- {file = "charset-normalizer-2.1.0.tar.gz", hash = "sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413"},
- {file = "charset_normalizer-2.1.0-py3-none-any.whl", hash = "sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5"},
-]
+cffi = []
+charset-normalizer = []
click = [
{file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
{file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
@@ -2664,85 +2204,12 @@ colorama = [
{file = "colorama-0.4.5-py2.py3-none-any.whl", hash = "sha256:854bf444933e37f5824ae7bfc1e98d5bce2ebe4160d46b5edf346a89358e99da"},
{file = "colorama-0.4.5.tar.gz", hash = "sha256:e6c6b4334fc50988a639d9b98aa429a0b57da6e17b9a44f0451f930b6967b7a4"},
]
-commonmark = [
- {file = "commonmark-0.9.1-py2.py3-none-any.whl", hash = "sha256:da2f38c92590f83de410ba1a3cbceafbc74fee9def35f9251ba9a971d6d66fd9"},
- {file = "commonmark-0.9.1.tar.gz", hash = "sha256:452f9dc859be7f06631ddcb328b6919c67984aca654e5fefb3914d54691aed60"},
-]
-coverage = [
- {file = "coverage-6.4.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f1d5aa2703e1dab4ae6cf416eb0095304f49d004c39e9db1d86f57924f43006b"},
- {file = "coverage-6.4.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4ce1b258493cbf8aec43e9b50d89982346b98e9ffdfaae8ae5793bc112fb0068"},
- {file = "coverage-6.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83c4e737f60c6936460c5be330d296dd5b48b3963f48634c53b3f7deb0f34ec4"},
- {file = "coverage-6.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:84e65ef149028516c6d64461b95a8dbcfce95cfd5b9eb634320596173332ea84"},
- {file = "coverage-6.4.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f69718750eaae75efe506406c490d6fc5a6161d047206cc63ce25527e8a3adad"},
- {file = "coverage-6.4.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e57816f8ffe46b1df8f12e1b348f06d164fd5219beba7d9433ba79608ef011cc"},
- {file = "coverage-6.4.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:01c5615d13f3dd3aa8543afc069e5319cfa0c7d712f6e04b920431e5c564a749"},
- {file = "coverage-6.4.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:75ab269400706fab15981fd4bd5080c56bd5cc07c3bccb86aab5e1d5a88dc8f4"},
- {file = "coverage-6.4.1-cp310-cp310-win32.whl", hash = "sha256:a7f3049243783df2e6cc6deafc49ea123522b59f464831476d3d1448e30d72df"},
- {file = "coverage-6.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:ee2ddcac99b2d2aec413e36d7a429ae9ebcadf912946b13ffa88e7d4c9b712d6"},
- {file = "coverage-6.4.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fb73e0011b8793c053bfa85e53129ba5f0250fdc0392c1591fd35d915ec75c46"},
- {file = "coverage-6.4.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:106c16dfe494de3193ec55cac9640dd039b66e196e4641fa8ac396181578b982"},
- {file = "coverage-6.4.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:87f4f3df85aa39da00fd3ec4b5abeb7407e82b68c7c5ad181308b0e2526da5d4"},
- {file = "coverage-6.4.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:961e2fb0680b4f5ad63234e0bf55dfb90d302740ae9c7ed0120677a94a1590cb"},
- {file = "coverage-6.4.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:cec3a0f75c8f1031825e19cd86ee787e87cf03e4fd2865c79c057092e69e3a3b"},
- {file = "coverage-6.4.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:129cd05ba6f0d08a766d942a9ed4b29283aff7b2cccf5b7ce279d50796860bb3"},
- {file = "coverage-6.4.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:bf5601c33213d3cb19d17a796f8a14a9eaa5e87629a53979a5981e3e3ae166f6"},
- {file = "coverage-6.4.1-cp37-cp37m-win32.whl", hash = "sha256:269eaa2c20a13a5bf17558d4dc91a8d078c4fa1872f25303dddcbba3a813085e"},
- {file = "coverage-6.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:f02cbbf8119db68455b9d763f2f8737bb7db7e43720afa07d8eb1604e5c5ae28"},
- {file = "coverage-6.4.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:ffa9297c3a453fba4717d06df579af42ab9a28022444cae7fa605af4df612d54"},
- {file = "coverage-6.4.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:145f296d00441ca703a659e8f3eb48ae39fb083baba2d7ce4482fb2723e050d9"},
- {file = "coverage-6.4.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d67d44996140af8b84284e5e7d398e589574b376fb4de8ccd28d82ad8e3bea13"},
- {file = "coverage-6.4.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2bd9a6fc18aab8d2e18f89b7ff91c0f34ff4d5e0ba0b33e989b3cd4194c81fd9"},
- {file = "coverage-6.4.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3384f2a3652cef289e38100f2d037956194a837221edd520a7ee5b42d00cc605"},
- {file = "coverage-6.4.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9b3e07152b4563722be523e8cd0b209e0d1a373022cfbde395ebb6575bf6790d"},
- {file = "coverage-6.4.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:1480ff858b4113db2718848d7b2d1b75bc79895a9c22e76a221b9d8d62496428"},
- {file = "coverage-6.4.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:865d69ae811a392f4d06bde506d531f6a28a00af36f5c8649684a9e5e4a85c83"},
- {file = "coverage-6.4.1-cp38-cp38-win32.whl", hash = "sha256:664a47ce62fe4bef9e2d2c430306e1428ecea207ffd68649e3b942fa8ea83b0b"},
- {file = "coverage-6.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:26dff09fb0d82693ba9e6231248641d60ba606150d02ed45110f9ec26404ed1c"},
- {file = "coverage-6.4.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d9c80df769f5ec05ad21ea34be7458d1dc51ff1fb4b2219e77fe24edf462d6df"},
- {file = "coverage-6.4.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:39ee53946bf009788108b4dd2894bf1349b4e0ca18c2016ffa7d26ce46b8f10d"},
- {file = "coverage-6.4.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5b66caa62922531059bc5ac04f836860412f7f88d38a476eda0a6f11d4724f4"},
- {file = "coverage-6.4.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fd180ed867e289964404051a958f7cccabdeed423f91a899829264bb7974d3d3"},
- {file = "coverage-6.4.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:84631e81dd053e8a0d4967cedab6db94345f1c36107c71698f746cb2636c63e3"},
- {file = "coverage-6.4.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:8c08da0bd238f2970230c2a0d28ff0e99961598cb2e810245d7fc5afcf1254e8"},
- {file = "coverage-6.4.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d42c549a8f41dc103a8004b9f0c433e2086add8a719da00e246e17cbe4056f72"},
- {file = "coverage-6.4.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:309ce4a522ed5fca432af4ebe0f32b21d6d7ccbb0f5fcc99290e71feba67c264"},
- {file = "coverage-6.4.1-cp39-cp39-win32.whl", hash = "sha256:fdb6f7bd51c2d1714cea40718f6149ad9be6a2ee7d93b19e9f00934c0f2a74d9"},
- {file = "coverage-6.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:342d4aefd1c3e7f620a13f4fe563154d808b69cccef415415aece4c786665397"},
- {file = "coverage-6.4.1-pp36.pp37.pp38-none-any.whl", hash = "sha256:4803e7ccf93230accb928f3a68f00ffa80a88213af98ed338a57ad021ef06815"},
- {file = "coverage-6.4.1.tar.gz", hash = "sha256:4321f075095a096e70aff1d002030ee612b65a205a0a0f5b815280d5dc58100c"},
-]
+coverage = []
cycler = [
{file = "cycler-0.11.0-py3-none-any.whl", hash = "sha256:3a27e95f763a428a739d2add979fa7494c912a32c17c4c38c4d5f082cad165a3"},
{file = "cycler-0.11.0.tar.gz", hash = "sha256:9c87405839a19696e837b3b818fed3f5f69f16f1eec1a1ad77e043dcea9c772f"},
]
-datasets = [
- {file = "datasets-1.18.4-py3-none-any.whl", hash = "sha256:e13695ad7aeda2af4430ac1a0b62def9c4b60bb4cc14dbaa240e6683cac50c49"},
- {file = "datasets-1.18.4.tar.gz", hash = "sha256:8f28a7afc2f894c68cb017335a32812f443fe41bc59c089cbd15d7412d3f7f96"},
-]
-debugpy = [
- {file = "debugpy-1.6.0-cp310-cp310-macosx_10_15_x86_64.whl", hash = "sha256:eb1946efac0c0c3d411cea0b5ac772fbde744109fd9520fb0c5a51979faf05ad"},
- {file = "debugpy-1.6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:e3513399177dd37af4c1332df52da5da1d0c387e5927dc4c0709e26ee7302e8f"},
- {file = "debugpy-1.6.0-cp310-cp310-win32.whl", hash = "sha256:5c492235d6b68f879df3bdbdb01f25c15be15682665517c2c7d0420e5658d71f"},
- {file = "debugpy-1.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:40de9ba137d355538432209d05e0f5fe5d0498dce761c39119ad4b950b51db31"},
- {file = "debugpy-1.6.0-cp37-cp37m-macosx_10_15_x86_64.whl", hash = "sha256:0d383b91efee57dbb923ba20801130cf60450a0eda60bce25bccd937de8e323a"},
- {file = "debugpy-1.6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1ff853e60e77e1c16f85a31adb8360bb2d98ca588d7ed645b7f0985b240bdb5e"},
- {file = "debugpy-1.6.0-cp37-cp37m-win32.whl", hash = "sha256:8e972c717d95f56b6a3a7a29a5ede1ee8f2c3802f6f0e678203b0778eb322bf1"},
- {file = "debugpy-1.6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:a8aaeb53e87225141fda7b9081bd87155c1debc13e2f5a532d341112d1983b65"},
- {file = "debugpy-1.6.0-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:132defb585b518955358321d0f42f6aa815aa15b432be27db654807707c70b2f"},
- {file = "debugpy-1.6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8ee75844242b4537beb5899f3e60a578454d1f136b99e8d57ac424573797b94a"},
- {file = "debugpy-1.6.0-cp38-cp38-win32.whl", hash = "sha256:a65a2499761d47df3e9ea9567109be6e73d412e00ac3ffcf74839f3ddfcdf028"},
- {file = "debugpy-1.6.0-cp38-cp38-win_amd64.whl", hash = "sha256:bd980d533d0ddfc451e03a3bb32acb2900049fec39afc3425b944ebf0889be62"},
- {file = "debugpy-1.6.0-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:245c7789a012f86210847ec7ee9f38c30a30d4c2223c3e111829a76c9006a5d0"},
- {file = "debugpy-1.6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:0e3aa2368883e83e7b689ddff3cafb595f7b711f6a065886b46a96a7fef874e7"},
- {file = "debugpy-1.6.0-cp39-cp39-win32.whl", hash = "sha256:72bcfa97f3afa0064afc77ab811f48ad4a06ac330f290b675082c24437730366"},
- {file = "debugpy-1.6.0-cp39-cp39-win_amd64.whl", hash = "sha256:30abefefd2ff5a5481162d613cb70e60e2fa80a5eb4c994717c0f008ed25d2e1"},
- {file = "debugpy-1.6.0-py2.py3-none-any.whl", hash = "sha256:4de7777842da7e08652f2776c552070bbdd758557fdec73a15d7be0e4aab95ce"},
- {file = "debugpy-1.6.0.zip", hash = "sha256:7b79c40852991f7b6c3ea65845ed0f5f6b731c37f4f9ad9c61e2ab4bd48a9275"},
-]
-decorator = [
- {file = "decorator-5.1.1-py3-none-any.whl", hash = "sha256:b8c3f85900b9dc423225913c5aace94729fe1fa9763b38939a95226f02d37186"},
- {file = "decorator-5.1.1.tar.gz", hash = "sha256:637996211036b6385ef91435e4fae22989472f9d571faba8927ba8253acbc330"},
-]
+datasets = []
defusedxml = [
{file = "defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61"},
{file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"},
@@ -2751,14 +2218,8 @@ dill = [
{file = "dill-0.3.5.1-py2.py3-none-any.whl", hash = "sha256:33501d03270bbe410c72639b350e941882a8b0fd55357580fbc873fba0c59302"},
{file = "dill-0.3.5.1.tar.gz", hash = "sha256:d75e41f3eff1eee599d738e76ba8f4ad98ea229db8b085318aa2b3333a208c86"},
]
-docstring-parser = [
- {file = "docstring_parser-0.14.1-py3-none-any.whl", hash = "sha256:14ac6ec1f1ba6905c4d8cb90fd0bc55394f5678183752c90e44812bf28d7a515"},
- {file = "docstring_parser-0.14.1.tar.gz", hash = "sha256:2c77522e31b7c88b1ab457a1f3c9ae38947ad719732260ba77ee8a3deb58622a"},
-]
-docutils = [
- {file = "docutils-0.16-py2.py3-none-any.whl", hash = "sha256:0c5b78adfbf7762415433f5515cd5c9e762339e23369dbe8000d84a4bf4ab3af"},
- {file = "docutils-0.16.tar.gz", hash = "sha256:c2de3a60e9e7d07be26b7f2b00ca0309c207e06c100f9cc2a94931fc75a478fc"},
-]
+docstring-parser = []
+docutils = []
entrypoints = [
{file = "entrypoints-0.4-py3-none-any.whl", hash = "sha256:f174b5ff827504fd3cd97cc3f8649f3693f51538c7e4bdf3ef002c8429d42f9f"},
{file = "entrypoints-0.4.tar.gz", hash = "sha256:b706eddaa9218a19ebcd67b56818f05bb27589b1ca9e8d797b74affad4ccacd4"},
@@ -2771,10 +2232,7 @@ filelock = [
{file = "filelock-3.7.1-py3-none-any.whl", hash = "sha256:37def7b658813cda163b56fc564cdc75e86d338246458c4c28ae84cabefa2404"},
{file = "filelock-3.7.1.tar.gz", hash = "sha256:3a0fd85166ad9dbab54c9aec96737b744106dc5f15c0b09a6744a445299fcf04"},
]
-flake8 = [
- {file = "flake8-3.9.2-py2.py3-none-any.whl", hash = "sha256:bf8fd333346d844f616e8d47905ef3a3384edae6b4e9beb0c5101e25e3110907"},
- {file = "flake8-3.9.2.tar.gz", hash = "sha256:07528381786f2a6237b061f6e96610a4167b226cb926e2aa2b6b1d78057c576b"},
-]
+flake8 = []
fonttools = [
{file = "fonttools-4.33.3-py3-none-any.whl", hash = "sha256:f829c579a8678fa939a1d9e9894d01941db869de44390adb49ce67055a06cc2a"},
{file = "fonttools-4.33.3.zip", hash = "sha256:c0fdcfa8ceebd7c1b2021240bd46ef77aa8e7408cf10434be55df52384865f8e"},
@@ -2844,141 +2302,29 @@ fsspec = [
{file = "fsspec-2022.5.0-py3-none-any.whl", hash = "sha256:2c198c50eb541a80bbd03540b07602c4a957366f3fb416a1f270d34bd4ff0926"},
{file = "fsspec-2022.5.0.tar.gz", hash = "sha256:7a5459c75c44e760fbe6a3ccb1f37e81e023cde7da8ba20401258d877ec483b4"},
]
-gitdb = [
- {file = "gitdb-4.0.9-py3-none-any.whl", hash = "sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd"},
- {file = "gitdb-4.0.9.tar.gz", hash = "sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa"},
-]
-gitpython = [
- {file = "GitPython-3.1.27-py3-none-any.whl", hash = "sha256:5b68b000463593e05ff2b261acff0ff0972df8ab1b70d3cdbd41b546c8b8fc3d"},
- {file = "GitPython-3.1.27.tar.gz", hash = "sha256:1c885ce809e8ba2d88a29befeb385fcea06338d3640712b59ca623c220bb5704"},
-]
-google-auth = [
- {file = "google-auth-2.9.0.tar.gz", hash = "sha256:3b2f9d2f436cc7c3b363d0ac66470f42fede249c3bafcc504e9f0bcbe983cff0"},
- {file = "google_auth-2.9.0-py2.py3-none-any.whl", hash = "sha256:75b3977e7e22784607e074800048f44d6a56df589fb2abe58a11d4d20c97c314"},
-]
-google-auth-oauthlib = [
- {file = "google-auth-oauthlib-0.4.6.tar.gz", hash = "sha256:a90a072f6993f2c327067bf65270046384cda5a8ecb20b94ea9a687f1f233a7a"},
- {file = "google_auth_oauthlib-0.4.6-py2.py3-none-any.whl", hash = "sha256:3f2a6e802eebbb6fb736a370fbf3b055edcb6b52878bf2f26330b5e041316c73"},
-]
-grpcio = [
- {file = "grpcio-1.47.0-cp310-cp310-linux_armv7l.whl", hash = "sha256:544da3458d1d249bb8aed5504adf3e194a931e212017934bf7bfa774dad37fb3"},
- {file = "grpcio-1.47.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:b88bec3f94a16411a1e0336eb69f335f58229e45d4082b12d8e554cedea97586"},
- {file = "grpcio-1.47.0-cp310-cp310-manylinux_2_17_aarch64.whl", hash = "sha256:06c0739dff9e723bca28ec22301f3711d85c2e652d1c8ae938aa0f7ad632ef9a"},
- {file = "grpcio-1.47.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f4508e8abd67ebcccd0fbde6e2b1917ba5d153f3f20c1de385abd8722545e05f"},
- {file = "grpcio-1.47.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e9723784cf264697024778dcf4b7542c851fe14b14681d6268fb984a53f76df1"},
- {file = "grpcio-1.47.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:1bb9afa85e797a646bfcd785309e869e80a375c959b11a17c9680abebacc0cb0"},
- {file = "grpcio-1.47.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4d9ad7122f60157454f74a850d1337ba135146cef6fb7956d78c7194d52db0fe"},
- {file = "grpcio-1.47.0-cp310-cp310-win32.whl", hash = "sha256:0425b5577be202d0a4024536bbccb1b052c47e0766096e6c3a5789ddfd5f400d"},
- {file = "grpcio-1.47.0-cp310-cp310-win_amd64.whl", hash = "sha256:d0d481ff55ea6cc49dab2c8276597bd4f1a84a8745fedb4bc23e12e9fb9d0e45"},
- {file = "grpcio-1.47.0-cp36-cp36m-linux_armv7l.whl", hash = "sha256:5f57b9b61c22537623a5577bf5f2f970dc4e50fac5391090114c6eb3ab5a129f"},
- {file = "grpcio-1.47.0-cp36-cp36m-macosx_10_10_x86_64.whl", hash = "sha256:14d2bc74218986e5edf5527e870b0969d63601911994ebf0dce96288548cf0ef"},
- {file = "grpcio-1.47.0-cp36-cp36m-manylinux_2_17_aarch64.whl", hash = "sha256:c79996ae64dc4d8730782dff0d1daacc8ce7d4c2ba9cef83b6f469f73c0655ce"},
- {file = "grpcio-1.47.0-cp36-cp36m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a24b50810aae90c74bbd901c3f175b9645802d2fbf03eadaf418ddee4c26668"},
- {file = "grpcio-1.47.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:55782a31ec539f15b34ee56f19131fe1430f38a4be022eb30c85e0b0dcf57f11"},
- {file = "grpcio-1.47.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:35dfd981b03a3ec842671d1694fe437ee9f7b9e6a02792157a2793b0eba4f478"},
- {file = "grpcio-1.47.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:664a270d3eac68183ad049665b0f4d0262ec387d5c08c0108dbcfe5b351a8b4d"},
- {file = "grpcio-1.47.0-cp36-cp36m-win32.whl", hash = "sha256:9298d6f2a81f132f72a7e79cbc90a511fffacc75045c2b10050bb87b86c8353d"},
- {file = "grpcio-1.47.0-cp36-cp36m-win_amd64.whl", hash = "sha256:815089435d0f113719eabf105832e4c4fa1726b39ae3fb2ca7861752b0f70570"},
- {file = "grpcio-1.47.0-cp37-cp37m-linux_armv7l.whl", hash = "sha256:7191ffc8bcf8a630c547287ab103e1fdf72b2e0c119e634d8a36055c1d988ad0"},
- {file = "grpcio-1.47.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:1ec63bbd09586e5cda1bdc832ae6975d2526d04433a764a1cc866caa399e50d4"},
- {file = "grpcio-1.47.0-cp37-cp37m-manylinux_2_17_aarch64.whl", hash = "sha256:08307dc5a6ac4da03146d6c00f62319e0665b01c6ffe805cfcaa955c17253f9c"},
- {file = "grpcio-1.47.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:668350ea02af018ca945bd629754d47126b366d981ab88e0369b53bc781ffb14"},
- {file = "grpcio-1.47.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64e097dd08bb408afeeaee9a56f75311c9ca5b27b8b0278279dc8eef85fa1051"},
- {file = "grpcio-1.47.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:0d8a7f3eb6f290189f48223a5f4464c99619a9de34200ce80d5092fb268323d2"},
- {file = "grpcio-1.47.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:f89de64d9eb3478b188859214752db50c91a749479011abd99e248550371375f"},
- {file = "grpcio-1.47.0-cp37-cp37m-win32.whl", hash = "sha256:67cd275a651532d28620eef677b97164a5438c5afcfd44b15e8992afa9eb598c"},
- {file = "grpcio-1.47.0-cp37-cp37m-win_amd64.whl", hash = "sha256:f515782b168a4ec6ea241add845ccfebe187fc7b09adf892b3ad9e2592c60af1"},
- {file = "grpcio-1.47.0-cp38-cp38-linux_armv7l.whl", hash = "sha256:91cd292373e85a52c897fa5b4768c895e20a7dc3423449c64f0f96388dd1812e"},
- {file = "grpcio-1.47.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:a278d02272214ec33f046864a24b5f5aab7f60f855de38c525e5b4ef61ec5b48"},
- {file = "grpcio-1.47.0-cp38-cp38-manylinux_2_17_aarch64.whl", hash = "sha256:bfdb8af4801d1c31a18d54b37f4e49bb268d1f485ecf47f70e78d56e04ff37a7"},
- {file = "grpcio-1.47.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9e63e0619a5627edb7a5eb3e9568b9f97e604856ba228cc1d8a9f83ce3d0466e"},
- {file = "grpcio-1.47.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc34d182c4fd64b6ff8304a606b95e814e4f8ed4b245b6d6cc9607690e3ef201"},
- {file = "grpcio-1.47.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a6b2432ac2353c80a56d9015dfc5c4af60245c719628d4193ecd75ddf9cd248c"},
- {file = "grpcio-1.47.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:fcd5d932842df503eb0bf60f9cc35e6fe732b51f499e78b45234e0be41b0018d"},
- {file = "grpcio-1.47.0-cp38-cp38-win32.whl", hash = "sha256:43857d06b2473b640467467f8f553319b5e819e54be14c86324dad83a0547818"},
- {file = "grpcio-1.47.0-cp38-cp38-win_amd64.whl", hash = "sha256:96cff5a2081db82fb710db6a19dd8f904bdebb927727aaf4d9c427984b79a4c1"},
- {file = "grpcio-1.47.0-cp39-cp39-linux_armv7l.whl", hash = "sha256:68b5e47fcca8481f36ef444842801928e60e30a5b3852c9f4a95f2582d10dcb2"},
- {file = "grpcio-1.47.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0cd44d78f302ff67f11a8c49b786c7ccbed2cfef6f4fd7bb0c3dc9255415f8f7"},
- {file = "grpcio-1.47.0-cp39-cp39-manylinux_2_17_aarch64.whl", hash = "sha256:4706c78b0c183dca815bbb4ef3e8dd2136ccc8d1699f62c585e75e211ad388f6"},
- {file = "grpcio-1.47.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:324e363bad4d89a8ec7124013371f268d43afd0ac0fdeec1b21c1a101eb7dafb"},
- {file = "grpcio-1.47.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b821403907e865e8377af3eee62f0cb233ea2369ba0fcdce9505ca5bfaf4eeb3"},
- {file = "grpcio-1.47.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:2061dbe41e43b0a5e1fd423e8a7fb3a0cf11d69ce22d0fac21f1a8c704640b12"},
- {file = "grpcio-1.47.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8dbef03853a0dbe457417c5469cb0f9d5bf47401b49d50c7dad3c495663b699b"},
- {file = "grpcio-1.47.0-cp39-cp39-win32.whl", hash = "sha256:090dfa19f41efcbe760ae59b34da4304d4be9a59960c9682b7eab7e0b6748a79"},
- {file = "grpcio-1.47.0-cp39-cp39-win_amd64.whl", hash = "sha256:55cd8b13c5ef22003889f599b8f2930836c6f71cd7cf3fc0196633813dc4f928"},
- {file = "grpcio-1.47.0.tar.gz", hash = "sha256:5dbba95fab9b35957b4977b8904fc1fa56b302f9051eff4d7716ebb0c087f801"},
-]
-h5py = [
- {file = "h5py-3.7.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d77af42cb751ad6cc44f11bae73075a07429a5cf2094dfde2b1e716e059b3911"},
- {file = "h5py-3.7.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:63beb8b7b47d0896c50de6efb9a1eaa81dbe211f3767e7dd7db159cea51ba37a"},
- {file = "h5py-3.7.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:04e2e1e2fc51b8873e972a08d2f89625ef999b1f2d276199011af57bb9fc7851"},
- {file = "h5py-3.7.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f73307c876af49aa869ec5df1818e9bb0bdcfcf8a5ba773cc45a4fba5a286a5c"},
- {file = "h5py-3.7.0-cp310-cp310-win_amd64.whl", hash = "sha256:f514b24cacdd983e61f8d371edac8c1b780c279d0acb8485639e97339c866073"},
- {file = "h5py-3.7.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:43fed4d13743cf02798a9a03a360a88e589d81285e72b83f47d37bb64ed44881"},
- {file = "h5py-3.7.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:c038399ce09a58ff8d89ec3e62f00aa7cb82d14f34e24735b920e2a811a3a426"},
- {file = "h5py-3.7.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:03d64fb86bb86b978928bad923b64419a23e836499ec6363e305ad28afd9d287"},
- {file = "h5py-3.7.0-cp37-cp37m-win_amd64.whl", hash = "sha256:e5b7820b75f9519499d76cc708e27242ccfdd9dfb511d6deb98701961d0445aa"},
- {file = "h5py-3.7.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a9351d729ea754db36d175098361b920573fdad334125f86ac1dd3a083355e20"},
- {file = "h5py-3.7.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6776d896fb90c5938de8acb925e057e2f9f28755f67ec3edcbc8344832616c38"},
- {file = "h5py-3.7.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:0a047fddbe6951bce40e9cde63373c838a978c5e05a011a682db9ba6334b8e85"},
- {file = "h5py-3.7.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0798a9c0ff45f17d0192e4d7114d734cac9f8b2b2c76dd1d923c4d0923f27bb6"},
- {file = "h5py-3.7.0-cp38-cp38-win_amd64.whl", hash = "sha256:0d8de8cb619fc597da7cf8cdcbf3b7ff8c5f6db836568afc7dc16d21f59b2b49"},
- {file = "h5py-3.7.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f084bbe816907dfe59006756f8f2d16d352faff2d107f4ffeb1d8de126fc5dc7"},
- {file = "h5py-3.7.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1fcb11a2dc8eb7ddcae08afd8fae02ba10467753a857fa07a404d700a93f3d53"},
- {file = "h5py-3.7.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:ed43e2cc4f511756fd664fb45d6b66c3cbed4e3bd0f70e29c37809b2ae013c44"},
- {file = "h5py-3.7.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9e7535df5ee3dc3e5d1f408fdfc0b33b46bc9b34db82743c82cd674d8239b9ad"},
- {file = "h5py-3.7.0-cp39-cp39-win_amd64.whl", hash = "sha256:9e2ad2aa000f5b1e73b5dfe22f358ca46bf1a2b6ca394d9659874d7fc251731a"},
- {file = "h5py-3.7.0.tar.gz", hash = "sha256:3fcf37884383c5da64846ab510190720027dca0768def34dd8dcb659dbe5cbf3"},
-]
+ghp-import = []
+gitdb = []
+gitpython = []
+google-auth = []
+google-auth-oauthlib = []
+griffe = []
+grpcio = []
+h5py = []
huggingface-hub = [
{file = "huggingface_hub-0.8.1-py3-none-any.whl", hash = "sha256:a11fb8d696a26f927833d46b7633105fd864fd92a2beb1140cbf1b2f703dedb3"},
{file = "huggingface_hub-0.8.1.tar.gz", hash = "sha256:75c70797da54b849f06c2cbf7ba2217250ee217230b9f65547d5db3c5bd84bb5"},
]
-hypothesis = [
- {file = "hypothesis-4.24.0-py2-none-any.whl", hash = "sha256:965c4bf29103278360c4f41eb4536047075d241880f596a09748f6013bd55659"},
- {file = "hypothesis-4.24.0-py3-none-any.whl", hash = "sha256:b3627548a36d1205213b8bcf961e74c2cfb68f2adfbc58a26d0a1c1f4d52f480"},
- {file = "hypothesis-4.24.0.tar.gz", hash = "sha256:b804bb87c2e963cc5f97e5da383db3a307e73a065eda2d5a9ec49c78583299ae"},
-]
+hypothesis = []
idna = [
{file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"},
{file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"},
]
-imagesize = [
- {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
- {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
-]
-importlib-metadata = [
- {file = "importlib_metadata-4.12.0-py3-none-any.whl", hash = "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23"},
- {file = "importlib_metadata-4.12.0.tar.gz", hash = "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670"},
-]
-importlib-resources = [
- {file = "importlib_resources-5.8.0-py3-none-any.whl", hash = "sha256:7952325ffd516c05a8ad0858c74dff2c3343f136fe66a6002b2623dd1d43f223"},
- {file = "importlib_resources-5.8.0.tar.gz", hash = "sha256:568c9f16cb204f9decc8d6d24a572eeea27dacbb4cee9e6b03a8025736769751"},
-]
+importlib-metadata = []
+importlib-resources = []
iniconfig = [
{file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"},
{file = "iniconfig-1.1.1.tar.gz", hash = "sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"},
]
-ipykernel = [
- {file = "ipykernel-6.15.0-py3-none-any.whl", hash = "sha256:b9ed519a29eb819eb82e87e0d3754088237b233e5c647b8bb0ff23c8c70ed16f"},
- {file = "ipykernel-6.15.0.tar.gz", hash = "sha256:b59f9d9672c3a483494bb75915a2b315e78b833a38b039b1ee36dc28683f0d89"},
-]
-ipython = [
- {file = "ipython-7.34.0-py3-none-any.whl", hash = "sha256:c175d2440a1caff76116eb719d40538fbb316e214eda85c5515c303aacbfb23e"},
- {file = "ipython-7.34.0.tar.gz", hash = "sha256:af3bdb46aa292bce5615b1b2ebc76c2080c5f77f54bda2ec72461317273e7cd6"},
-]
-ipython-genutils = [
- {file = "ipython_genutils-0.2.0-py2.py3-none-any.whl", hash = "sha256:72dd37233799e619666c9f639a9da83c34013a73e8bbc79a7a6348d93c61fab8"},
- {file = "ipython_genutils-0.2.0.tar.gz", hash = "sha256:eb2e116e75ecef9d4d228fdc66af54269afa26ab4463042e33785b887c628ba8"},
-]
-ipywidgets = [
- {file = "ipywidgets-7.7.1-py2.py3-none-any.whl", hash = "sha256:aa1076ab7102b2486ae2607c43c243200a07c17d6093676c419d4b6762489a50"},
- {file = "ipywidgets-7.7.1.tar.gz", hash = "sha256:5f2fa1b7afae1af32c88088c9828ad978de93ddda393d7ed414e553fee93dcab"},
-]
-jedi = [
- {file = "jedi-0.18.1-py2.py3-none-any.whl", hash = "sha256:637c9635fcf47945ceb91cd7f320234a7be540ded6f3e99a50cb6febdfd1ba8d"},
- {file = "jedi-0.18.1.tar.gz", hash = "sha256:74137626a64a99c8eb6ae5832d99b3bdd7d29a3850fe2aa80a4126b2a7d949ab"},
-]
jinja2 = [
{file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
{file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
@@ -2987,14 +2333,8 @@ joblib = [
{file = "joblib-1.1.0-py2.py3-none-any.whl", hash = "sha256:f21f109b3c7ff9d95f8387f752d0d9c34a02aa2f7060c2135f465da0e5160ff6"},
{file = "joblib-1.1.0.tar.gz", hash = "sha256:4158fcecd13733f8be669be0683b96ebdbbd38d23559f54dca7205aea1bf1e35"},
]
-jsonargparse = [
- {file = "jsonargparse-4.10.2-py3-none-any.whl", hash = "sha256:d889c710506386a1aece6ac3ff4b4256512873c4e14ccf954ab3bcd0def11958"},
- {file = "jsonargparse-4.10.2.tar.gz", hash = "sha256:1a620760dec54492f158378523bdb341837578cb5c69b3fa141e36642c3cf308"},
-]
-jsonschema = [
- {file = "jsonschema-4.6.1-py3-none-any.whl", hash = "sha256:5eb781753403847fb320f05e9ab2191725b58c5e7f97f1bed63285ca423159bc"},
- {file = "jsonschema-4.6.1.tar.gz", hash = "sha256:ec2802e6a37517f09d47d9ba107947589ae1d25ff557b925d83a321fc2aa5d3b"},
-]
+jsonargparse = []
+jsonschema = []
jupyter-client = [
{file = "jupyter_client-7.3.4-py3-none-any.whl", hash = "sha256:17d74b0d0a7b24f1c8c527b24fcf4607c56bee542ffe8e3418e50b21e514b621"},
{file = "jupyter_client-7.3.4.tar.gz", hash = "sha256:aa9a6c32054b290374f95f73bb0cae91455c58dfb84f65c8591912b8f65e6d56"},
@@ -3003,18 +2343,11 @@ jupyter-core = [
{file = "jupyter_core-4.10.0-py3-none-any.whl", hash = "sha256:e7f5212177af7ab34179690140f188aa9bf3d322d8155ed972cbded19f55b6f3"},
{file = "jupyter_core-4.10.0.tar.gz", hash = "sha256:a6de44b16b7b31d7271130c71a6792c4040f077011961138afed5e5e73181aec"},
]
-jupyter-sphinx = [
- {file = "jupyter_sphinx-0.3.2-py3-none-any.whl", hash = "sha256:301e36d0fb3007bb5802f6b65b60c24990eb99c983332a2ab6eecff385207dc9"},
- {file = "jupyter_sphinx-0.3.2.tar.gz", hash = "sha256:37fc9408385c45326ac79ca0452fbd7ae2bf0e97842d626d2844d4830e30aaf2"},
-]
jupyterlab-pygments = [
{file = "jupyterlab_pygments-0.2.2-py2.py3-none-any.whl", hash = "sha256:2405800db07c9f770863bcf8049a529c3dd4d3e28536638bd7c1c01d2748309f"},
{file = "jupyterlab_pygments-0.2.2.tar.gz", hash = "sha256:7405d7fde60819d905a9fa8ce89e4cd830e318cdad22a0030f7a901da705585d"},
]
-jupyterlab-widgets = [
- {file = "jupyterlab_widgets-1.1.1-py3-none-any.whl", hash = "sha256:90ab47d99da03a3697074acb23b2975ead1d6171aa41cb2812041a7f2a08177a"},
- {file = "jupyterlab_widgets-1.1.1.tar.gz", hash = "sha256:67d0ef1e407e0c42c8ab60b9d901cd7a4c68923650763f75bf17fb06c1943b79"},
-]
+jupytext = []
kiwisolver = [
{file = "kiwisolver-1.4.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:fd2842a0faed9ab9aba0922c951906132d9384be89690570f0ed18cd4f20e658"},
{file = "kiwisolver-1.4.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:caa59e2cae0e23b1e225447d7a9ddb0f982f42a6a22d497a484dfe62a06f7c0e"},
@@ -3060,14 +2393,9 @@ kiwisolver = [
{file = "kiwisolver-1.4.3-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:130c6c35eded399d3967cf8a542c20b671f5ba85bd6f210f8b939f868360e9eb"},
{file = "kiwisolver-1.4.3.tar.gz", hash = "sha256:ab8a15c2750ae8d53e31f77a94f846d0a00772240f1c12817411fa2344351f86"},
]
-lightning-flash = [
- {file = "lightning-flash-0.7.5.tar.gz", hash = "sha256:f1fdf9d070a40ac8f8eab25600d7d29a1ee53da1153e2b8ba8dade27cd1b82d5"},
- {file = "lightning_flash-0.7.5-py3-none-any.whl", hash = "sha256:0ccfba92bae847fdfccf5f2a93cde667df229fef1be89658701f683635111208"},
-]
-markdown = [
- {file = "Markdown-3.3.7-py3-none-any.whl", hash = "sha256:f5da449a6e1c989a4cea2631aa8ee67caa5a2ef855d551c88f9e309f4634c621"},
- {file = "Markdown-3.3.7.tar.gz", hash = "sha256:cbb516f16218e643d8e0a95b309f77eb118cb138d39a4f27851e6a63581db874"},
-]
+lightning-flash = []
+markdown = []
+markdown-it-py = []
markupsafe = [
{file = "MarkupSafe-2.1.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:86b1f75c4e7c2ac2ccdaec2b9022845dbb81880ca318bb7a0a01fbf7813e3812"},
{file = "MarkupSafe-2.1.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f121a1420d4e173a5d96e47e9a0c0dcff965afdf1626d28de1460815f7c4ee7a"},
@@ -3147,18 +2475,25 @@ matplotlib = [
{file = "matplotlib-3.5.2-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:24173c23d1bcbaed5bf47b8785d27933a1ac26a5d772200a0f3e0e38f471b001"},
{file = "matplotlib-3.5.2.tar.gz", hash = "sha256:48cf850ce14fa18067f2d9e0d646763681948487a8080ec0af2686468b4607a2"},
]
-matplotlib-inline = [
- {file = "matplotlib-inline-0.1.3.tar.gz", hash = "sha256:a04bfba22e0d1395479f866853ec1ee28eea1485c1d69a6faf00dc3e24ff34ee"},
- {file = "matplotlib_inline-0.1.3-py3-none-any.whl", hash = "sha256:aed605ba3b72462d64d475a21a9296f400a19c4f74a31b59103d2a99ffd5aa5c"},
-]
mccabe = [
{file = "mccabe-0.6.1-py2.py3-none-any.whl", hash = "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42"},
{file = "mccabe-0.6.1.tar.gz", hash = "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"},
]
+mdit-py-plugins = []
+mdurl = []
+mergedeep = []
mistune = [
{file = "mistune-0.8.4-py2.py3-none-any.whl", hash = "sha256:88a1051873018da288eee8538d476dffe1262495144b33ecb586c4ab266bb8d4"},
{file = "mistune-0.8.4.tar.gz", hash = "sha256:59a3429db53c50b5c6bcc8a07f8848cb00d7dc8bdb431a4ab41920d201d4756e"},
]
+mkdocs = []
+mkdocs-autorefs = []
+mkdocs-jupyter = []
+mkdocs-material = []
+mkdocs-material-extensions = []
+mkdocstrings = []
+mkdocstrings-python = []
+mkdocstrings-python-legacy = []
multidict = [
{file = "multidict-6.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b9e95a740109c6047602f4db4da9949e6c5945cefbad34a1299775ddc9a62e2"},
{file = "multidict-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac0e27844758d7177989ce406acc6a83c16ed4524ebc363c1f748cba184d89d3"},
@@ -3243,39 +2578,12 @@ multiprocess = [
{file = "multiprocess-0.70.13-py39-none-any.whl", hash = "sha256:00ef48461d43d1e30f8f4b2e1b287ecaaffec325a37053beb5503e0d69e5a3cd"},
{file = "multiprocess-0.70.13.tar.gz", hash = "sha256:2e096dd618a84d15aa369a9cf6695815e5539f853dc8fa4f4b9153b11b1d0b32"},
]
-mypy = [
- {file = "mypy-0.910-cp35-cp35m-macosx_10_9_x86_64.whl", hash = "sha256:a155d80ea6cee511a3694b108c4494a39f42de11ee4e61e72bc424c490e46457"},
- {file = "mypy-0.910-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:b94e4b785e304a04ea0828759172a15add27088520dc7e49ceade7834275bedb"},
- {file = "mypy-0.910-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:088cd9c7904b4ad80bec811053272986611b84221835e079be5bcad029e79dd9"},
- {file = "mypy-0.910-cp35-cp35m-win_amd64.whl", hash = "sha256:adaeee09bfde366d2c13fe6093a7df5df83c9a2ba98638c7d76b010694db760e"},
- {file = "mypy-0.910-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:ecd2c3fe726758037234c93df7e98deb257fd15c24c9180dacf1ef829da5f921"},
- {file = "mypy-0.910-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:d9dd839eb0dc1bbe866a288ba3c1afc33a202015d2ad83b31e875b5905a079b6"},
- {file = "mypy-0.910-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:3e382b29f8e0ccf19a2df2b29a167591245df90c0b5a2542249873b5c1d78212"},
- {file = "mypy-0.910-cp36-cp36m-win_amd64.whl", hash = "sha256:53fd2eb27a8ee2892614370896956af2ff61254c275aaee4c230ae771cadd885"},
- {file = "mypy-0.910-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b6fb13123aeef4a3abbcfd7e71773ff3ff1526a7d3dc538f3929a49b42be03f0"},
- {file = "mypy-0.910-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:e4dab234478e3bd3ce83bac4193b2ecd9cf94e720ddd95ce69840273bf44f6de"},
- {file = "mypy-0.910-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:7df1ead20c81371ccd6091fa3e2878559b5c4d4caadaf1a484cf88d93ca06703"},
- {file = "mypy-0.910-cp37-cp37m-win_amd64.whl", hash = "sha256:0aadfb2d3935988ec3815952e44058a3100499f5be5b28c34ac9d79f002a4a9a"},
- {file = "mypy-0.910-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:ec4e0cd079db280b6bdabdc807047ff3e199f334050db5cbb91ba3e959a67504"},
- {file = "mypy-0.910-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:119bed3832d961f3a880787bf621634ba042cb8dc850a7429f643508eeac97b9"},
- {file = "mypy-0.910-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:866c41f28cee548475f146aa4d39a51cf3b6a84246969f3759cb3e9c742fc072"},
- {file = "mypy-0.910-cp38-cp38-win_amd64.whl", hash = "sha256:ceb6e0a6e27fb364fb3853389607cf7eb3a126ad335790fa1e14ed02fba50811"},
- {file = "mypy-0.910-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1a85e280d4d217150ce8cb1a6dddffd14e753a4e0c3cf90baabb32cefa41b59e"},
- {file = "mypy-0.910-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:42c266ced41b65ed40a282c575705325fa7991af370036d3f134518336636f5b"},
- {file = "mypy-0.910-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:3c4b8ca36877fc75339253721f69603a9c7fdb5d4d5a95a1a1b899d8b86a4de2"},
- {file = "mypy-0.910-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:c0df2d30ed496a08de5daed2a9ea807d07c21ae0ab23acf541ab88c24b26ab97"},
- {file = "mypy-0.910-cp39-cp39-win_amd64.whl", hash = "sha256:c6c2602dffb74867498f86e6129fd52a2770c48b7cd3ece77ada4fa38f94eba8"},
- {file = "mypy-0.910-py3-none-any.whl", hash = "sha256:ef565033fa5a958e62796867b1df10c40263ea9ded87164d67572834e57a174d"},
- {file = "mypy-0.910.tar.gz", hash = "sha256:704098302473cb31a218f1775a873b376b30b4c18229421e9e9dc8916fd16150"},
-]
+mypy = []
mypy-extensions = [
{file = "mypy_extensions-0.4.3-py2.py3-none-any.whl", hash = "sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d"},
{file = "mypy_extensions-0.4.3.tar.gz", hash = "sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"},
]
-nbclient = [
- {file = "nbclient-0.6.6-py3-none-any.whl", hash = "sha256:09bae4ea2df79fa6bc50aeb8278d8b79d2036792824337fa6eee834afae17312"},
- {file = "nbclient-0.6.6.tar.gz", hash = "sha256:0df76a7961d99a681b4796c74a1f2553b9f998851acc01896dce064ad19a9027"},
-]
+nbclient = []
nbconvert = [
{file = "nbconvert-6.5.0-py3-none-any.whl", hash = "sha256:c56dd0b8978a1811a5654f74c727ff16ca87dd5a43abd435a1c49b840fcd8360"},
{file = "nbconvert-6.5.0.tar.gz", hash = "sha256:223e46e27abe8596b8aed54301fadbba433b7ffea8196a68fd7b1ff509eee99d"},
@@ -3284,173 +2592,27 @@ nbformat = [
{file = "nbformat-5.4.0-py3-none-any.whl", hash = "sha256:0d6072aaec95dddc39735c144ee8bbc6589c383fb462e4058abc855348152dad"},
{file = "nbformat-5.4.0.tar.gz", hash = "sha256:44ba5ca6acb80c5d5a500f1e5b83ede8cbe364d5a495c4c8cf60aaf1ba656501"},
]
-nbsphinx = [
- {file = "nbsphinx-0.8.9-py3-none-any.whl", hash = "sha256:a7d743762249ee6bac3350a91eb3717a6e1c75f239f2c2a85491f9aca5a63be1"},
- {file = "nbsphinx-0.8.9.tar.gz", hash = "sha256:4ade86b2a41f8f41efd3ea99dae84c3368fe8ba3f837d50c8815ce9424c5994f"},
-]
nest-asyncio = [
{file = "nest_asyncio-1.5.5-py3-none-any.whl", hash = "sha256:b98e3ec1b246135e4642eceffa5a6c23a3ab12c82ff816a92c612d68205813b2"},
{file = "nest_asyncio-1.5.5.tar.gz", hash = "sha256:e442291cd942698be619823a17a86a5759eabe1f8613084790de189fe9e16d65"},
]
-notebook = [
- {file = "notebook-6.4.12-py3-none-any.whl", hash = "sha256:8c07a3bb7640e371f8a609bdbb2366a1976c6a2589da8ef917f761a61e3ad8b1"},
- {file = "notebook-6.4.12.tar.gz", hash = "sha256:6268c9ec9048cff7a45405c990c29ac9ca40b0bc3ec29263d218c5e01f2b4e86"},
-]
-numpy = [
- {file = "numpy-1.21.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:8737609c3bbdd48e380d463134a35ffad3b22dc56295eff6f79fd85bd0eeeb25"},
- {file = "numpy-1.21.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:fdffbfb6832cd0b300995a2b08b8f6fa9f6e856d562800fea9182316d99c4e8e"},
- {file = "numpy-1.21.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3820724272f9913b597ccd13a467cc492a0da6b05df26ea09e78b171a0bb9da6"},
- {file = "numpy-1.21.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f17e562de9edf691a42ddb1eb4a5541c20dd3f9e65b09ded2beb0799c0cf29bb"},
- {file = "numpy-1.21.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f30427731561ce75d7048ac254dbe47a2ba576229250fb60f0fb74db96501a1"},
- {file = "numpy-1.21.6-cp310-cp310-win32.whl", hash = "sha256:d4bf4d43077db55589ffc9009c0ba0a94fa4908b9586d6ccce2e0b164c86303c"},
- {file = "numpy-1.21.6-cp310-cp310-win_amd64.whl", hash = "sha256:d136337ae3cc69aa5e447e78d8e1514be8c3ec9b54264e680cf0b4bd9011574f"},
- {file = "numpy-1.21.6-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6aaf96c7f8cebc220cdfc03f1d5a31952f027dda050e5a703a0d1c396075e3e7"},
- {file = "numpy-1.21.6-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:67c261d6c0a9981820c3a149d255a76918278a6b03b6a036800359aba1256d46"},
- {file = "numpy-1.21.6-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a6be4cb0ef3b8c9250c19cc122267263093eee7edd4e3fa75395dfda8c17a8e2"},
- {file = "numpy-1.21.6-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7c4068a8c44014b2d55f3c3f574c376b2494ca9cc73d2f1bd692382b6dffe3db"},
- {file = "numpy-1.21.6-cp37-cp37m-win32.whl", hash = "sha256:7c7e5fa88d9ff656e067876e4736379cc962d185d5cd808014a8a928d529ef4e"},
- {file = "numpy-1.21.6-cp37-cp37m-win_amd64.whl", hash = "sha256:bcb238c9c96c00d3085b264e5c1a1207672577b93fa666c3b14a45240b14123a"},
- {file = "numpy-1.21.6-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:82691fda7c3f77c90e62da69ae60b5ac08e87e775b09813559f8901a88266552"},
- {file = "numpy-1.21.6-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:643843bcc1c50526b3a71cd2ee561cf0d8773f062c8cbaf9ffac9fdf573f83ab"},
- {file = "numpy-1.21.6-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:357768c2e4451ac241465157a3e929b265dfac85d9214074985b1786244f2ef3"},
- {file = "numpy-1.21.6-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:9f411b2c3f3d76bba0865b35a425157c5dcf54937f82bbeb3d3c180789dd66a6"},
- {file = "numpy-1.21.6-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4aa48afdce4660b0076a00d80afa54e8a97cd49f457d68a4342d188a09451c1a"},
- {file = "numpy-1.21.6-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6a96eef20f639e6a97d23e57dd0c1b1069a7b4fd7027482a4c5c451cd7732f4"},
- {file = "numpy-1.21.6-cp38-cp38-win32.whl", hash = "sha256:5c3c8def4230e1b959671eb959083661b4a0d2e9af93ee339c7dada6759a9470"},
- {file = "numpy-1.21.6-cp38-cp38-win_amd64.whl", hash = "sha256:bf2ec4b75d0e9356edea834d1de42b31fe11f726a81dfb2c2112bc1eaa508fcf"},
- {file = "numpy-1.21.6-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:4391bd07606be175aafd267ef9bea87cf1b8210c787666ce82073b05f202add1"},
- {file = "numpy-1.21.6-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:67f21981ba2f9d7ba9ade60c9e8cbaa8cf8e9ae51673934480e45cf55e953673"},
- {file = "numpy-1.21.6-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ee5ec40fdd06d62fe5d4084bef4fd50fd4bb6bfd2bf519365f569dc470163ab0"},
- {file = "numpy-1.21.6-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:1dbe1c91269f880e364526649a52eff93ac30035507ae980d2fed33aaee633ac"},
- {file = "numpy-1.21.6-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:d9caa9d5e682102453d96a0ee10c7241b72859b01a941a397fd965f23b3e016b"},
- {file = "numpy-1.21.6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58459d3bad03343ac4b1b42ed14d571b8743dc80ccbf27444f266729df1d6f5b"},
- {file = "numpy-1.21.6-cp39-cp39-win32.whl", hash = "sha256:7f5ae4f304257569ef3b948810816bc87c9146e8c446053539947eedeaa32786"},
- {file = "numpy-1.21.6-cp39-cp39-win_amd64.whl", hash = "sha256:e31f0bb5928b793169b87e3d1e070f2342b22d5245c755e2b81caa29756246c3"},
- {file = "numpy-1.21.6-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:dd1c8f6bd65d07d3810b90d02eba7997e32abbdf1277a481d698969e921a3be0"},
- {file = "numpy-1.21.6.zip", hash = "sha256:ecb55251139706669fdec2ff073c98ef8e9a84473e51e716211b41aa0f18e656"},
-]
-numpydoc = [
- {file = "numpydoc-1.4.0-py3-none-any.whl", hash = "sha256:fd26258868ebcc75c816fe68e1d41e3b55bd410941acfb969dee3eef6e5cf260"},
- {file = "numpydoc-1.4.0.tar.gz", hash = "sha256:9494daf1c7612f59905fa09e65c9b8a90bbacb3804d91f7a94e778831e6fcfa5"},
-]
-oauthlib = [
- {file = "oauthlib-3.2.0-py3-none-any.whl", hash = "sha256:6db33440354787f9b7f3a6dbd4febf5d0f93758354060e802f6c06cb493022fe"},
- {file = "oauthlib-3.2.0.tar.gz", hash = "sha256:23a8208d75b902797ea29fd31fa80a15ed9dc2c6c16fe73f5d346f83f6fa27a2"},
-]
+numpy = []
+oauthlib = []
packaging = [
{file = "packaging-21.3-py3-none-any.whl", hash = "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"},
{file = "packaging-21.3.tar.gz", hash = "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb"},
]
-pandas = [
- {file = "pandas-1.1.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:bf23a3b54d128b50f4f9d4675b3c1857a688cc6731a32f931837d72effb2698d"},
- {file = "pandas-1.1.5-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:5a780260afc88268a9d3ac3511d8f494fdcf637eece62fb9eb656a63d53eb7ca"},
- {file = "pandas-1.1.5-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:b61080750d19a0122469ab59b087380721d6b72a4e7d962e4d7e63e0c4504814"},
- {file = "pandas-1.1.5-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:0de3ddb414d30798cbf56e642d82cac30a80223ad6fe484d66c0ce01a84d6f2f"},
- {file = "pandas-1.1.5-cp36-cp36m-win32.whl", hash = "sha256:70865f96bb38fec46f7ebd66d4b5cfd0aa6b842073f298d621385ae3898d28b5"},
- {file = "pandas-1.1.5-cp36-cp36m-win_amd64.whl", hash = "sha256:19a2148a1d02791352e9fa637899a78e371a3516ac6da5c4edc718f60cbae648"},
- {file = "pandas-1.1.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:26fa92d3ac743a149a31b21d6f4337b0594b6302ea5575b37af9ca9611e8981a"},
- {file = "pandas-1.1.5-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:c16d59c15d946111d2716856dd5479221c9e4f2f5c7bc2d617f39d870031e086"},
- {file = "pandas-1.1.5-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:3be7a7a0ca71a2640e81d9276f526bca63505850add10206d0da2e8a0a325dae"},
- {file = "pandas-1.1.5-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:573fba5b05bf2c69271a32e52399c8de599e4a15ab7cec47d3b9c904125ab788"},
- {file = "pandas-1.1.5-cp37-cp37m-win32.whl", hash = "sha256:21b5a2b033380adbdd36b3116faaf9a4663e375325831dac1b519a44f9e439bb"},
- {file = "pandas-1.1.5-cp37-cp37m-win_amd64.whl", hash = "sha256:24c7f8d4aee71bfa6401faeba367dd654f696a77151a8a28bc2013f7ced4af98"},
- {file = "pandas-1.1.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2860a97cbb25444ffc0088b457da0a79dc79f9c601238a3e0644312fcc14bf11"},
- {file = "pandas-1.1.5-cp38-cp38-manylinux1_i686.whl", hash = "sha256:5008374ebb990dad9ed48b0f5d0038124c73748f5384cc8c46904dace27082d9"},
- {file = "pandas-1.1.5-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:2c2f7c670ea4e60318e4b7e474d56447cf0c7d83b3c2a5405a0dbb2600b9c48e"},
- {file = "pandas-1.1.5-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:0a643bae4283a37732ddfcecab3f62dd082996021b980f580903f4e8e01b3c5b"},
- {file = "pandas-1.1.5-cp38-cp38-win32.whl", hash = "sha256:5447ea7af4005b0daf695a316a423b96374c9c73ffbd4533209c5ddc369e644b"},
- {file = "pandas-1.1.5-cp38-cp38-win_amd64.whl", hash = "sha256:4c62e94d5d49db116bef1bd5c2486723a292d79409fc9abd51adf9e05329101d"},
- {file = "pandas-1.1.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:731568be71fba1e13cae212c362f3d2ca8932e83cb1b85e3f1b4dd77d019254a"},
- {file = "pandas-1.1.5-cp39-cp39-manylinux1_i686.whl", hash = "sha256:c61c043aafb69329d0f961b19faa30b1dab709dd34c9388143fc55680059e55a"},
- {file = "pandas-1.1.5-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:2b1c6cd28a0dfda75c7b5957363333f01d370936e4c6276b7b8e696dd500582a"},
- {file = "pandas-1.1.5-cp39-cp39-win32.whl", hash = "sha256:c94ff2780a1fd89f190390130d6d36173ca59fcfb3fe0ff596f9a56518191ccb"},
- {file = "pandas-1.1.5-cp39-cp39-win_amd64.whl", hash = "sha256:edda9bacc3843dfbeebaf7a701763e68e741b08fccb889c003b0a52f0ee95782"},
- {file = "pandas-1.1.5.tar.gz", hash = "sha256:f10fc41ee3c75a474d3bdf68d396f10782d013d7f67db99c0efbfd0acb99701b"},
-]
+pandas = []
pandocfilters = [
{file = "pandocfilters-1.5.0-py2.py3-none-any.whl", hash = "sha256:33aae3f25fd1a026079f5d27bdd52496f0e0803b3469282162bafdcbdf6ef14f"},
{file = "pandocfilters-1.5.0.tar.gz", hash = "sha256:0b679503337d233b4339a817bfc8c50064e2eff681314376a47cb582305a7a38"},
]
-parso = [
- {file = "parso-0.8.3-py2.py3-none-any.whl", hash = "sha256:c001d4636cd3aecdaf33cbb40aebb59b094be2a74c556778ef5576c175e19e75"},
- {file = "parso-0.8.3.tar.gz", hash = "sha256:8c07be290bb59f03588915921e29e8a50002acaf2cdc5fa0e0114f91709fafa0"},
-]
pathspec = [
{file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"},
{file = "pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"},
]
-pbr = [
- {file = "pbr-5.9.0-py2.py3-none-any.whl", hash = "sha256:e547125940bcc052856ded43be8e101f63828c2d94239ffbe2b327ba3d5ccf0a"},
- {file = "pbr-5.9.0.tar.gz", hash = "sha256:e8dca2f4b43560edef58813969f52a56cef023146cbb8931626db80e6c1c4308"},
-]
-pexpect = [
- {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
- {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
-]
-pickleshare = [
- {file = "pickleshare-0.7.5-py2.py3-none-any.whl", hash = "sha256:9649af414d74d4df115d5d718f82acb59c9d418196b7b4290ed47a12ce62df56"},
- {file = "pickleshare-0.7.5.tar.gz", hash = "sha256:87683d47965c1da65cdacaf31c8441d12b8044cdec9aca500cd78fc2c683afca"},
-]
-pillow = [
- {file = "Pillow-9.2.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:a9c9bc489f8ab30906d7a85afac4b4944a572a7432e00698a7239f44a44e6efb"},
- {file = "Pillow-9.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:510cef4a3f401c246cfd8227b300828715dd055463cdca6176c2e4036df8bd4f"},
- {file = "Pillow-9.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7888310f6214f19ab2b6df90f3f06afa3df7ef7355fc025e78a3044737fab1f5"},
- {file = "Pillow-9.2.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:831e648102c82f152e14c1a0938689dbb22480c548c8d4b8b248b3e50967b88c"},
- {file = "Pillow-9.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1cc1d2451e8a3b4bfdb9caf745b58e6c7a77d2e469159b0d527a4554d73694d1"},
- {file = "Pillow-9.2.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:136659638f61a251e8ed3b331fc6ccd124590eeff539de57c5f80ef3a9594e58"},
- {file = "Pillow-9.2.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:6e8c66f70fb539301e064f6478d7453e820d8a2c631da948a23384865cd95544"},
- {file = "Pillow-9.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:37ff6b522a26d0538b753f0b4e8e164fdada12db6c6f00f62145d732d8a3152e"},
- {file = "Pillow-9.2.0-cp310-cp310-win32.whl", hash = "sha256:c79698d4cd9318d9481d89a77e2d3fcaeff5486be641e60a4b49f3d2ecca4e28"},
- {file = "Pillow-9.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:254164c57bab4b459f14c64e93df11eff5ded575192c294a0c49270f22c5d93d"},
- {file = "Pillow-9.2.0-cp311-cp311-macosx_10_10_universal2.whl", hash = "sha256:408673ed75594933714482501fe97e055a42996087eeca7e5d06e33218d05aa8"},
- {file = "Pillow-9.2.0-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:727dd1389bc5cb9827cbd1f9d40d2c2a1a0c9b32dd2261db522d22a604a6eec9"},
- {file = "Pillow-9.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50dff9cc21826d2977ef2d2a205504034e3a4563ca6f5db739b0d1026658e004"},
- {file = "Pillow-9.2.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cb6259196a589123d755380b65127ddc60f4c64b21fc3bb46ce3a6ea663659b0"},
- {file = "Pillow-9.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b0554af24df2bf96618dac71ddada02420f946be943b181108cac55a7a2dcd4"},
- {file = "Pillow-9.2.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:15928f824870535c85dbf949c09d6ae7d3d6ac2d6efec80f3227f73eefba741c"},
- {file = "Pillow-9.2.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:bdd0de2d64688ecae88dd8935012c4a72681e5df632af903a1dca8c5e7aa871a"},
- {file = "Pillow-9.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5b87da55a08acb586bad5c3aa3b86505f559b84f39035b233d5bf844b0834b1"},
- {file = "Pillow-9.2.0-cp311-cp311-win32.whl", hash = "sha256:b6d5e92df2b77665e07ddb2e4dbd6d644b78e4c0d2e9272a852627cdba0d75cf"},
- {file = "Pillow-9.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:6bf088c1ce160f50ea40764f825ec9b72ed9da25346216b91361eef8ad1b8f8c"},
- {file = "Pillow-9.2.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:2c58b24e3a63efd22554c676d81b0e57f80e0a7d3a5874a7e14ce90ec40d3069"},
- {file = "Pillow-9.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eef7592281f7c174d3d6cbfbb7ee5984a671fcd77e3fc78e973d492e9bf0eb3f"},
- {file = "Pillow-9.2.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dcd7b9c7139dc8258d164b55696ecd16c04607f1cc33ba7af86613881ffe4ac8"},
- {file = "Pillow-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a138441e95562b3c078746a22f8fca8ff1c22c014f856278bdbdd89ca36cff1b"},
- {file = "Pillow-9.2.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:93689632949aff41199090eff5474f3990b6823404e45d66a5d44304e9cdc467"},
- {file = "Pillow-9.2.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:f3fac744f9b540148fa7715a435d2283b71f68bfb6d4aae24482a890aed18b59"},
- {file = "Pillow-9.2.0-cp37-cp37m-win32.whl", hash = "sha256:fa768eff5f9f958270b081bb33581b4b569faabf8774726b283edb06617101dc"},
- {file = "Pillow-9.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:69bd1a15d7ba3694631e00df8de65a8cb031911ca11f44929c97fe05eb9b6c1d"},
- {file = "Pillow-9.2.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:030e3460861488e249731c3e7ab59b07c7853838ff3b8e16aac9561bb345da14"},
- {file = "Pillow-9.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:74a04183e6e64930b667d321524e3c5361094bb4af9083db5c301db64cd341f3"},
- {file = "Pillow-9.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2d33a11f601213dcd5718109c09a52c2a1c893e7461f0be2d6febc2879ec2402"},
- {file = "Pillow-9.2.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1fd6f5e3c0e4697fa7eb45b6e93996299f3feee73a3175fa451f49a74d092b9f"},
- {file = "Pillow-9.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a647c0d4478b995c5e54615a2e5360ccedd2f85e70ab57fbe817ca613d5e63b8"},
- {file = "Pillow-9.2.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:4134d3f1ba5f15027ff5c04296f13328fecd46921424084516bdb1b2548e66ff"},
- {file = "Pillow-9.2.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:bc431b065722a5ad1dfb4df354fb9333b7a582a5ee39a90e6ffff688d72f27a1"},
- {file = "Pillow-9.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:1536ad017a9f789430fb6b8be8bf99d2f214c76502becc196c6f2d9a75b01b76"},
- {file = "Pillow-9.2.0-cp38-cp38-win32.whl", hash = "sha256:2ad0d4df0f5ef2247e27fc790d5c9b5a0af8ade9ba340db4a73bb1a4a3e5fb4f"},
- {file = "Pillow-9.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:ec52c351b35ca269cb1f8069d610fc45c5bd38c3e91f9ab4cbbf0aebc136d9c8"},
- {file = "Pillow-9.2.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0ed2c4ef2451de908c90436d6e8092e13a43992f1860275b4d8082667fbb2ffc"},
- {file = "Pillow-9.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4ad2f835e0ad81d1689f1b7e3fbac7b01bb8777d5a985c8962bedee0cc6d43da"},
- {file = "Pillow-9.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea98f633d45f7e815db648fd7ff0f19e328302ac36427343e4432c84432e7ff4"},
- {file = "Pillow-9.2.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7761afe0126d046974a01e030ae7529ed0ca6a196de3ec6937c11df0df1bc91c"},
- {file = "Pillow-9.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9a54614049a18a2d6fe156e68e188da02a046a4a93cf24f373bffd977e943421"},
- {file = "Pillow-9.2.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:5aed7dde98403cd91d86a1115c78d8145c83078e864c1de1064f52e6feb61b20"},
- {file = "Pillow-9.2.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:13b725463f32df1bfeacbf3dd197fb358ae8ebcd8c5548faa75126ea425ccb60"},
- {file = "Pillow-9.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:808add66ea764ed97d44dda1ac4f2cfec4c1867d9efb16a33d158be79f32b8a4"},
- {file = "Pillow-9.2.0-cp39-cp39-win32.whl", hash = "sha256:337a74fd2f291c607d220c793a8135273c4c2ab001b03e601c36766005f36885"},
- {file = "Pillow-9.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:fac2d65901fb0fdf20363fbd345c01958a742f2dc62a8dd4495af66e3ff502a4"},
- {file = "Pillow-9.2.0-pp37-pypy37_pp73-macosx_10_10_x86_64.whl", hash = "sha256:ad2277b185ebce47a63f4dc6302e30f05762b688f8dc3de55dbae4651872cdf3"},
- {file = "Pillow-9.2.0-pp37-pypy37_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7c7b502bc34f6e32ba022b4a209638f9e097d7a9098104ae420eb8186217ebbb"},
- {file = "Pillow-9.2.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d1f14f5f691f55e1b47f824ca4fdcb4b19b4323fe43cc7bb105988cad7496be"},
- {file = "Pillow-9.2.0-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:dfe4c1fedfde4e2fbc009d5ad420647f7730d719786388b7de0999bf32c0d9fd"},
- {file = "Pillow-9.2.0-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:f07f1f00e22b231dd3d9b9208692042e29792d6bd4f6639415d2f23158a80013"},
- {file = "Pillow-9.2.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1802f34298f5ba11d55e5bb09c31997dc0c6aed919658dfdf0198a2fe75d5490"},
- {file = "Pillow-9.2.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17d4cafe22f050b46d983b71c707162d63d796a1235cdf8b9d7a112e97b15bac"},
- {file = "Pillow-9.2.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:96b5e6874431df16aee0c1ba237574cb6dff1dcb173798faa6a9d8b399a05d0e"},
- {file = "Pillow-9.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:0030fdbd926fb85844b8b92e2f9449ba89607231d3dd597a21ae72dc7fe26927"},
- {file = "Pillow-9.2.0.tar.gz", hash = "sha256:75e636fd3e0fb872693f23ccb8a5ff2cd578801251f3a4f6854c6a5d437d3c04"},
-]
+pbr = []
+pillow = []
platformdirs = [
{file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
{file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
@@ -3459,80 +2621,7 @@ pluggy = [
{file = "pluggy-1.0.0-py2.py3-none-any.whl", hash = "sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3"},
{file = "pluggy-1.0.0.tar.gz", hash = "sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159"},
]
-prometheus-client = [
- {file = "prometheus_client-0.14.1-py3-none-any.whl", hash = "sha256:522fded625282822a89e2773452f42df14b5a8e84a86433e3f8a189c1d54dc01"},
- {file = "prometheus_client-0.14.1.tar.gz", hash = "sha256:5459c427624961076277fdc6dc50540e2bacb98eebde99886e59ec55ed92093a"},
-]
-prompt-toolkit = [
- {file = "prompt_toolkit-3.0.30-py3-none-any.whl", hash = "sha256:d8916d3f62a7b67ab353a952ce4ced6a1d2587dfe9ef8ebc30dd7c386751f289"},
- {file = "prompt_toolkit-3.0.30.tar.gz", hash = "sha256:859b283c50bde45f5f97829f77a4674d1c1fcd88539364f1b28a37805cfd89c0"},
-]
-protobuf = [
- {file = "protobuf-3.19.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f51d5a9f137f7a2cec2d326a74b6e3fc79d635d69ffe1b036d39fc7d75430d37"},
- {file = "protobuf-3.19.4-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:09297b7972da685ce269ec52af761743714996b4381c085205914c41fcab59fb"},
- {file = "protobuf-3.19.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:072fbc78d705d3edc7ccac58a62c4c8e0cec856987da7df8aca86e647be4e35c"},
- {file = "protobuf-3.19.4-cp310-cp310-win32.whl", hash = "sha256:7bb03bc2873a2842e5ebb4801f5c7ff1bfbdf426f85d0172f7644fcda0671ae0"},
- {file = "protobuf-3.19.4-cp310-cp310-win_amd64.whl", hash = "sha256:f358aa33e03b7a84e0d91270a4d4d8f5df6921abe99a377828839e8ed0c04e07"},
- {file = "protobuf-3.19.4-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:1c91ef4110fdd2c590effb5dca8fdbdcb3bf563eece99287019c4204f53d81a4"},
- {file = "protobuf-3.19.4-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c438268eebb8cf039552897d78f402d734a404f1360592fef55297285f7f953f"},
- {file = "protobuf-3.19.4-cp36-cp36m-win32.whl", hash = "sha256:835a9c949dc193953c319603b2961c5c8f4327957fe23d914ca80d982665e8ee"},
- {file = "protobuf-3.19.4-cp36-cp36m-win_amd64.whl", hash = "sha256:4276cdec4447bd5015453e41bdc0c0c1234eda08420b7c9a18b8d647add51e4b"},
- {file = "protobuf-3.19.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6cbc312be5e71869d9d5ea25147cdf652a6781cf4d906497ca7690b7b9b5df13"},
- {file = "protobuf-3.19.4-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:54a1473077f3b616779ce31f477351a45b4fef8c9fd7892d6d87e287a38df368"},
- {file = "protobuf-3.19.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:435bb78b37fc386f9275a7035fe4fb1364484e38980d0dd91bc834a02c5ec909"},
- {file = "protobuf-3.19.4-cp37-cp37m-win32.whl", hash = "sha256:16f519de1313f1b7139ad70772e7db515b1420d208cb16c6d7858ea989fc64a9"},
- {file = "protobuf-3.19.4-cp37-cp37m-win_amd64.whl", hash = "sha256:cdc076c03381f5c1d9bb1abdcc5503d9ca8b53cf0a9d31a9f6754ec9e6c8af0f"},
- {file = "protobuf-3.19.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:69da7d39e39942bd52848438462674c463e23963a1fdaa84d88df7fbd7e749b2"},
- {file = "protobuf-3.19.4-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:48ed3877fa43e22bcacc852ca76d4775741f9709dd9575881a373bd3e85e54b2"},
- {file = "protobuf-3.19.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd95d1dfb9c4f4563e6093a9aa19d9c186bf98fa54da5252531cc0d3a07977e7"},
- {file = "protobuf-3.19.4-cp38-cp38-win32.whl", hash = "sha256:b38057450a0c566cbd04890a40edf916db890f2818e8682221611d78dc32ae26"},
- {file = "protobuf-3.19.4-cp38-cp38-win_amd64.whl", hash = "sha256:7ca7da9c339ca8890d66958f5462beabd611eca6c958691a8fe6eccbd1eb0c6e"},
- {file = "protobuf-3.19.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:36cecbabbda242915529b8ff364f2263cd4de7c46bbe361418b5ed859677ba58"},
- {file = "protobuf-3.19.4-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:c1068287025f8ea025103e37d62ffd63fec8e9e636246b89c341aeda8a67c934"},
- {file = "protobuf-3.19.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:96bd766831596d6014ca88d86dc8fe0fb2e428c0b02432fd9db3943202bf8c5e"},
- {file = "protobuf-3.19.4-cp39-cp39-win32.whl", hash = "sha256:84123274d982b9e248a143dadd1b9815049f4477dc783bf84efe6250eb4b836a"},
- {file = "protobuf-3.19.4-cp39-cp39-win_amd64.whl", hash = "sha256:3112b58aac3bac9c8be2b60a9daf6b558ca3f7681c130dcdd788ade7c9ffbdca"},
- {file = "protobuf-3.19.4-py2.py3-none-any.whl", hash = "sha256:8961c3a78ebfcd000920c9060a262f082f29838682b1f7201889300c1fbe0616"},
- {file = "protobuf-3.19.4.tar.gz", hash = "sha256:9df0c10adf3e83015ced42a9a7bd64e13d06c4cf45c340d2c63020ea04499d0a"},
-]
-psutil = [
- {file = "psutil-5.9.1-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:799759d809c31aab5fe4579e50addf84565e71c1dc9f1c31258f159ff70d3f87"},
- {file = "psutil-5.9.1-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:9272167b5f5fbfe16945be3db475b3ce8d792386907e673a209da686176552af"},
- {file = "psutil-5.9.1-cp27-cp27m-win32.whl", hash = "sha256:0904727e0b0a038830b019551cf3204dd48ef5c6868adc776e06e93d615fc5fc"},
- {file = "psutil-5.9.1-cp27-cp27m-win_amd64.whl", hash = "sha256:e7e10454cb1ab62cc6ce776e1c135a64045a11ec4c6d254d3f7689c16eb3efd2"},
- {file = "psutil-5.9.1-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:56960b9e8edcca1456f8c86a196f0c3d8e3e361320071c93378d41445ffd28b0"},
- {file = "psutil-5.9.1-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:44d1826150d49ffd62035785a9e2c56afcea66e55b43b8b630d7706276e87f22"},
- {file = "psutil-5.9.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c7be9d7f5b0d206f0bbc3794b8e16fb7dbc53ec9e40bbe8787c6f2d38efcf6c9"},
- {file = "psutil-5.9.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:abd9246e4cdd5b554a2ddd97c157e292ac11ef3e7af25ac56b08b455c829dca8"},
- {file = "psutil-5.9.1-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:29a442e25fab1f4d05e2655bb1b8ab6887981838d22effa2396d584b740194de"},
- {file = "psutil-5.9.1-cp310-cp310-win32.whl", hash = "sha256:20b27771b077dcaa0de1de3ad52d22538fe101f9946d6dc7869e6f694f079329"},
- {file = "psutil-5.9.1-cp310-cp310-win_amd64.whl", hash = "sha256:58678bbadae12e0db55186dc58f2888839228ac9f41cc7848853539b70490021"},
- {file = "psutil-5.9.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:3a76ad658641172d9c6e593de6fe248ddde825b5866464c3b2ee26c35da9d237"},
- {file = "psutil-5.9.1-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a6a11e48cb93a5fa606306493f439b4aa7c56cb03fc9ace7f6bfa21aaf07c453"},
- {file = "psutil-5.9.1-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:068935df39055bf27a29824b95c801c7a5130f118b806eee663cad28dca97685"},
- {file = "psutil-5.9.1-cp36-cp36m-win32.whl", hash = "sha256:0f15a19a05f39a09327345bc279c1ba4a8cfb0172cc0d3c7f7d16c813b2e7d36"},
- {file = "psutil-5.9.1-cp36-cp36m-win_amd64.whl", hash = "sha256:db417f0865f90bdc07fa30e1aadc69b6f4cad7f86324b02aa842034efe8d8c4d"},
- {file = "psutil-5.9.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:91c7ff2a40c373d0cc9121d54bc5f31c4fa09c346528e6a08d1845bce5771ffc"},
- {file = "psutil-5.9.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fea896b54f3a4ae6f790ac1d017101252c93f6fe075d0e7571543510f11d2676"},
- {file = "psutil-5.9.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3054e923204b8e9c23a55b23b6df73a8089ae1d075cb0bf711d3e9da1724ded4"},
- {file = "psutil-5.9.1-cp37-cp37m-win32.whl", hash = "sha256:d2d006286fbcb60f0b391741f520862e9b69f4019b4d738a2a45728c7e952f1b"},
- {file = "psutil-5.9.1-cp37-cp37m-win_amd64.whl", hash = "sha256:b14ee12da9338f5e5b3a3ef7ca58b3cba30f5b66f7662159762932e6d0b8f680"},
- {file = "psutil-5.9.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:19f36c16012ba9cfc742604df189f2f28d2720e23ff7d1e81602dbe066be9fd1"},
- {file = "psutil-5.9.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:944c4b4b82dc4a1b805329c980f270f170fdc9945464223f2ec8e57563139cf4"},
- {file = "psutil-5.9.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b6750a73a9c4a4e689490ccb862d53c7b976a2a35c4e1846d049dcc3f17d83b"},
- {file = "psutil-5.9.1-cp38-cp38-win32.whl", hash = "sha256:a8746bfe4e8f659528c5c7e9af5090c5a7d252f32b2e859c584ef7d8efb1e689"},
- {file = "psutil-5.9.1-cp38-cp38-win_amd64.whl", hash = "sha256:79c9108d9aa7fa6fba6e668b61b82facc067a6b81517cab34d07a84aa89f3df0"},
- {file = "psutil-5.9.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:28976df6c64ddd6320d281128817f32c29b539a52bdae5e192537bc338a9ec81"},
- {file = "psutil-5.9.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b88f75005586131276634027f4219d06e0561292be8bd6bc7f2f00bdabd63c4e"},
- {file = "psutil-5.9.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:645bd4f7bb5b8633803e0b6746ff1628724668681a434482546887d22c7a9537"},
- {file = "psutil-5.9.1-cp39-cp39-win32.whl", hash = "sha256:32c52611756096ae91f5d1499fe6c53b86f4a9ada147ee42db4991ba1520e574"},
- {file = "psutil-5.9.1-cp39-cp39-win_amd64.whl", hash = "sha256:f65f9a46d984b8cd9b3750c2bdb419b2996895b005aefa6cbaba9a143b1ce2c5"},
- {file = "psutil-5.9.1.tar.gz", hash = "sha256:57f1819b5d9e95cdfb0c881a8a5b7d542ed0b7c522d575706a80bedc848c8954"},
-]
-ptyprocess = [
- {file = "ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35"},
- {file = "ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220"},
-]
+protobuf = []
py = [
{file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"},
{file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"},
@@ -3569,56 +2658,20 @@ pyarrow = [
{file = "pyarrow-8.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:cb06cacc19f3b426681f2f6803cc06ff481e7fe5b3a533b406bc5b2138843d4f"},
{file = "pyarrow-8.0.0.tar.gz", hash = "sha256:4a18a211ed888f1ac0b0ebcb99e2d9a3e913a481120ee9b1fe33d3fedb945d4e"},
]
-pyasn1 = [
- {file = "pyasn1-0.4.8-py2.4.egg", hash = "sha256:fec3e9d8e36808a28efb59b489e4528c10ad0f480e57dcc32b4de5c9d8c9fdf3"},
- {file = "pyasn1-0.4.8-py2.5.egg", hash = "sha256:0458773cfe65b153891ac249bcf1b5f8f320b7c2ce462151f8fa74de8934becf"},
- {file = "pyasn1-0.4.8-py2.6.egg", hash = "sha256:5c9414dcfede6e441f7e8f81b43b34e834731003427e5b09e4e00e3172a10f00"},
- {file = "pyasn1-0.4.8-py2.7.egg", hash = "sha256:6e7545f1a61025a4e58bb336952c5061697da694db1cae97b116e9c46abcf7c8"},
- {file = "pyasn1-0.4.8-py2.py3-none-any.whl", hash = "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d"},
- {file = "pyasn1-0.4.8-py3.1.egg", hash = "sha256:78fa6da68ed2727915c4767bb386ab32cdba863caa7dbe473eaae45f9959da86"},
- {file = "pyasn1-0.4.8-py3.2.egg", hash = "sha256:08c3c53b75eaa48d71cf8c710312316392ed40899cb34710d092e96745a358b7"},
- {file = "pyasn1-0.4.8-py3.3.egg", hash = "sha256:03840c999ba71680a131cfaee6fab142e1ed9bbd9c693e285cc6aca0d555e576"},
- {file = "pyasn1-0.4.8-py3.4.egg", hash = "sha256:7ab8a544af125fb704feadb008c99a88805126fb525280b2270bb25cc1d78a12"},
- {file = "pyasn1-0.4.8-py3.5.egg", hash = "sha256:e89bf84b5437b532b0803ba5c9a5e054d21fec423a89952a74f87fa2c9b7bce2"},
- {file = "pyasn1-0.4.8-py3.6.egg", hash = "sha256:014c0e9976956a08139dc0712ae195324a75e142284d5f87f1a87ee1b068a359"},
- {file = "pyasn1-0.4.8-py3.7.egg", hash = "sha256:99fcc3c8d804d1bc6d9a099921e39d827026409a58f2a720dcdb89374ea0c776"},
- {file = "pyasn1-0.4.8.tar.gz", hash = "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba"},
-]
-pyasn1-modules = [
- {file = "pyasn1-modules-0.2.8.tar.gz", hash = "sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e"},
- {file = "pyasn1_modules-0.2.8-py2.4.egg", hash = "sha256:0fe1b68d1e486a1ed5473f1302bd991c1611d319bba158e98b106ff86e1d7199"},
- {file = "pyasn1_modules-0.2.8-py2.5.egg", hash = "sha256:fe0644d9ab041506b62782e92b06b8c68cca799e1a9636ec398675459e031405"},
- {file = "pyasn1_modules-0.2.8-py2.6.egg", hash = "sha256:a99324196732f53093a84c4369c996713eb8c89d360a496b599fb1a9c47fc3eb"},
- {file = "pyasn1_modules-0.2.8-py2.7.egg", hash = "sha256:0845a5582f6a02bb3e1bde9ecfc4bfcae6ec3210dd270522fee602365430c3f8"},
- {file = "pyasn1_modules-0.2.8-py2.py3-none-any.whl", hash = "sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74"},
- {file = "pyasn1_modules-0.2.8-py3.1.egg", hash = "sha256:f39edd8c4ecaa4556e989147ebf219227e2cd2e8a43c7e7fcb1f1c18c5fd6a3d"},
- {file = "pyasn1_modules-0.2.8-py3.2.egg", hash = "sha256:b80486a6c77252ea3a3e9b1e360bc9cf28eaac41263d173c032581ad2f20fe45"},
- {file = "pyasn1_modules-0.2.8-py3.3.egg", hash = "sha256:65cebbaffc913f4fe9e4808735c95ea22d7a7775646ab690518c056784bc21b4"},
- {file = "pyasn1_modules-0.2.8-py3.4.egg", hash = "sha256:15b7c67fabc7fc240d87fb9aabf999cf82311a6d6fb2c70d00d3d0604878c811"},
- {file = "pyasn1_modules-0.2.8-py3.5.egg", hash = "sha256:426edb7a5e8879f1ec54a1864f16b882c2837bfd06eee62f2c982315ee2473ed"},
- {file = "pyasn1_modules-0.2.8-py3.6.egg", hash = "sha256:cbac4bc38d117f2a49aeedec4407d23e8866ea4ac27ff2cf7fb3e5b570df19e0"},
- {file = "pyasn1_modules-0.2.8-py3.7.egg", hash = "sha256:c29a5e5cc7a3f05926aff34e097e84f8589cd790ce0ed41b67aed6857b26aafd"},
-]
-pycodestyle = [
- {file = "pycodestyle-2.7.0-py2.py3-none-any.whl", hash = "sha256:514f76d918fcc0b55c6680472f0a37970994e07bbb80725808c17089be302068"},
- {file = "pycodestyle-2.7.0.tar.gz", hash = "sha256:c389c1d06bf7904078ca03399a4816f974a1d590090fecea0c63ec26ebaf1cef"},
-]
+pyasn1 = []
+pyasn1-modules = []
+pycodestyle = []
pycparser = [
{file = "pycparser-2.21-py2.py3-none-any.whl", hash = "sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9"},
{file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
]
-pydeprecate = [
- {file = "pyDeprecate-0.3.2-py3-none-any.whl", hash = "sha256:ed86b68ed837e6465245904a3de2f59bf9eef78ac7a2502ee280533d04802457"},
- {file = "pyDeprecate-0.3.2.tar.gz", hash = "sha256:d481116cc5d7f6c473e7c4be820efdd9b90a16b594b350276e9e66a6cb5bdd29"},
-]
-pyflakes = [
- {file = "pyflakes-2.3.1-py2.py3-none-any.whl", hash = "sha256:7893783d01b8a89811dd72d7dfd4d84ff098e5eed95cfa8905b22bbffe52efc3"},
- {file = "pyflakes-2.3.1.tar.gz", hash = "sha256:f5bc8ecabc05bb9d291eb5203d6810b49040f6ff446a756326104746cc00c1db"},
-]
+pydeprecate = []
+pyflakes = []
pygments = [
{file = "Pygments-2.12.0-py3-none-any.whl", hash = "sha256:dc9c10fb40944260f6ed4c688ece0cd2048414940f1cea51b8b226318411c519"},
{file = "Pygments-2.12.0.tar.gz", hash = "sha256:5eb116118f9612ff1ee89ac96437bb6b49e8f04d8a13b514ba26f620208e26eb"},
]
+pymdown-extensions = []
pyparsing = [
{file = "pyparsing-3.0.9-py3-none-any.whl", hash = "sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"},
{file = "pyparsing-3.0.9.tar.gz", hash = "sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb"},
@@ -3646,26 +2699,15 @@ pyrsistent = [
{file = "pyrsistent-0.18.1-cp39-cp39-win_amd64.whl", hash = "sha256:e24a828f57e0c337c8d8bb9f6b12f09dfdf0273da25fda9e314f0b684b415a07"},
{file = "pyrsistent-0.18.1.tar.gz", hash = "sha256:d4d61f8b993a7255ba714df3aca52700f8125289f84f704cf80916517c46eb96"},
]
-pytest = [
- {file = "pytest-6.2.5-py3-none-any.whl", hash = "sha256:7310f8d27bc79ced999e760ca304d69f6ba6c6649c0b60fb0e04a4a77cacc134"},
- {file = "pytest-6.2.5.tar.gz", hash = "sha256:131b36680866a76e6781d13f101efb86cf674ebb9762eb70d3082b6f29889e89"},
-]
-pytest-cov = [
- {file = "pytest-cov-2.12.1.tar.gz", hash = "sha256:261ceeb8c227b726249b376b8526b600f38667ee314f910353fa318caa01f4d7"},
- {file = "pytest_cov-2.12.1-py2.py3-none-any.whl", hash = "sha256:261bb9e47e65bd099c89c3edf92972865210c36813f80ede5277dceb77a4a62a"},
-]
-pytest-mock = [
- {file = "pytest-mock-3.8.1.tar.gz", hash = "sha256:2c6d756d5d3bf98e2e80797a959ca7f81f479e7d1f5f571611b0fdd6d1745240"},
- {file = "pytest_mock-3.8.1-py3-none-any.whl", hash = "sha256:d989f11ca4a84479e288b0cd1e6769d6ad0d3d7743dcc75e460d1416a5f2135a"},
-]
+pytest = []
+pytest-cov = []
+pytest-mock = []
python-dateutil = [
{file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
{file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
]
-pytorch-lightning = [
- {file = "pytorch-lightning-1.6.4.tar.gz", hash = "sha256:5459f2c3e67676ec59e94576d1499e9559d214e7df41eadd135db64b4ccf54b9"},
- {file = "pytorch_lightning-1.6.4-py3-none-any.whl", hash = "sha256:0f42f93116a3fcb6fd8c9ea45cf7c918e4aa3f848ae21d0e9ac2bf39f2865dd7"},
-]
+pytkdocs = []
+pytorch-lightning = []
pytz = [
{file = "pytz-2022.1-py2.py3-none-any.whl", hash = "sha256:e68985985296d9a66a881eb3193b0906246245294a881e7c8afe623866ac6a5c"},
{file = "pytz-2022.1.tar.gz", hash = "sha256:1e760e2fe6a8163bc0b3d9a19c4f84342afa0a2affebfaa84b01b978a02ecaa7"},
@@ -3686,48 +2728,8 @@ pywin32 = [
{file = "pywin32-304-cp39-cp39-win32.whl", hash = "sha256:25746d841201fd9f96b648a248f731c1dec851c9a08b8e33da8b56148e4c65cc"},
{file = "pywin32-304-cp39-cp39-win_amd64.whl", hash = "sha256:d24a3382f013b21aa24a5cfbfad5a2cd9926610c0affde3e8ab5b3d7dbcf4ac9"},
]
-pywinpty = [
- {file = "pywinpty-2.0.5-cp310-none-win_amd64.whl", hash = "sha256:f86c76e2881c37e69678cbbf178109f8da1fa8584db24d58e1b9369b0276cfcb"},
- {file = "pywinpty-2.0.5-cp37-none-win_amd64.whl", hash = "sha256:ff9b52f182650cfdf3db1b264a6fe0963eb9d996a7a1fa843ac406c1e32111f8"},
- {file = "pywinpty-2.0.5-cp38-none-win_amd64.whl", hash = "sha256:651ee1467bd7eb6f64d44dbc954b7ab7d15ab6d8adacc4e13299692c67c5d5d2"},
- {file = "pywinpty-2.0.5-cp39-none-win_amd64.whl", hash = "sha256:e59a508ae78374febada3e53b5bbc90b5ad07ae68cbfd72a2e965f9793ae04f3"},
- {file = "pywinpty-2.0.5.tar.gz", hash = "sha256:e125d3f1804d8804952b13e33604ad2ca8b9b2cac92b27b521c005d1604794f8"},
-]
-pyyaml = [
- {file = "PyYAML-6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4db7c7aef085872ef65a8fd7d6d09a14ae91f691dec3e87ee5ee0539d516f53"},
- {file = "PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9df7ed3b3d2e0ecfe09e14741b857df43adb5a3ddadc919a2d94fbdf78fea53c"},
- {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77f396e6ef4c73fdc33a9157446466f1cff553d979bd00ecb64385760c6babdc"},
- {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a80a78046a72361de73f8f395f1f1e49f956c6be882eed58505a15f3e430962b"},
- {file = "PyYAML-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f84fbc98b019fef2ee9a1cb3ce93e3187a6df0b2538a651bfb890254ba9f90b5"},
- {file = "PyYAML-6.0-cp310-cp310-win32.whl", hash = "sha256:2cd5df3de48857ed0544b34e2d40e9fac445930039f3cfe4bcc592a1f836d513"},
- {file = "PyYAML-6.0-cp310-cp310-win_amd64.whl", hash = "sha256:daf496c58a8c52083df09b80c860005194014c3698698d1a57cbcfa182142a3a"},
- {file = "PyYAML-6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:897b80890765f037df3403d22bab41627ca8811ae55e9a722fd0392850ec4d86"},
- {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50602afada6d6cbfad699b0c7bb50d5ccffa7e46a3d738092afddc1f9758427f"},
- {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48c346915c114f5fdb3ead70312bd042a953a8ce5c7106d5bfb1a5254e47da92"},
- {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:98c4d36e99714e55cfbaaee6dd5badbc9a1ec339ebfc3b1f52e293aee6bb71a4"},
- {file = "PyYAML-6.0-cp36-cp36m-win32.whl", hash = "sha256:0283c35a6a9fbf047493e3a0ce8d79ef5030852c51e9d911a27badfde0605293"},
- {file = "PyYAML-6.0-cp36-cp36m-win_amd64.whl", hash = "sha256:07751360502caac1c067a8132d150cf3d61339af5691fe9e87803040dbc5db57"},
- {file = "PyYAML-6.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:819b3830a1543db06c4d4b865e70ded25be52a2e0631ccd2f6a47a2822f2fd7c"},
- {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:473f9edb243cb1935ab5a084eb238d842fb8f404ed2193a915d1784b5a6b5fc0"},
- {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0ce82d761c532fe4ec3f87fc45688bdd3a4c1dc5e0b4a19814b9009a29baefd4"},
- {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:231710d57adfd809ef5d34183b8ed1eeae3f76459c18fb4a0b373ad56bedcdd9"},
- {file = "PyYAML-6.0-cp37-cp37m-win32.whl", hash = "sha256:c5687b8d43cf58545ade1fe3e055f70eac7a5a1a0bf42824308d868289a95737"},
- {file = "PyYAML-6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:d15a181d1ecd0d4270dc32edb46f7cb7733c7c508857278d3d378d14d606db2d"},
- {file = "PyYAML-6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0b4624f379dab24d3725ffde76559cff63d9ec94e1736b556dacdfebe5ab6d4b"},
- {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:213c60cd50106436cc818accf5baa1aba61c0189ff610f64f4a3e8c6726218ba"},
- {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9fa600030013c4de8165339db93d182b9431076eb98eb40ee068700c9c813e34"},
- {file = "PyYAML-6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:277a0ef2981ca40581a47093e9e2d13b3f1fbbeffae064c1d21bfceba2030287"},
- {file = "PyYAML-6.0-cp38-cp38-win32.whl", hash = "sha256:d4eccecf9adf6fbcc6861a38015c2a64f38b9d94838ac1810a9023a0609e1b78"},
- {file = "PyYAML-6.0-cp38-cp38-win_amd64.whl", hash = "sha256:1e4747bc279b4f613a09eb64bba2ba602d8a6664c6ce6396a4d0cd413a50ce07"},
- {file = "PyYAML-6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:055d937d65826939cb044fc8c9b08889e8c743fdc6a32b33e2390f66013e449b"},
- {file = "PyYAML-6.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e61ceaab6f49fb8bdfaa0f92c4b57bcfbea54c09277b1b4f7ac376bfb7a7c174"},
- {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d67d839ede4ed1b28a4e8909735fc992a923cdb84e618544973d7dfc71540803"},
- {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cba8c411ef271aa037d7357a2bc8f9ee8b58b9965831d9e51baf703280dc73d3"},
- {file = "PyYAML-6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:40527857252b61eacd1d9af500c3337ba8deb8fc298940291486c465c8b46ec0"},
- {file = "PyYAML-6.0-cp39-cp39-win32.whl", hash = "sha256:b5b9eccad747aabaaffbc6064800670f0c297e52c12754eb1d976c57e4f74dcb"},
- {file = "PyYAML-6.0-cp39-cp39-win_amd64.whl", hash = "sha256:b3d267842bf12586ba6c734f89d1f5b871df0273157918b0ccefa29deb05c21c"},
- {file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
-]
+pyyaml = []
+pyyaml-env-tag = []
pyzmq = [
{file = "pyzmq-23.2.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:22ac0243a41798e3eb5d5714b28c2f28e3d10792dffbc8a5fca092f975fdeceb"},
{file = "pyzmq-23.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f685003d836ad0e5d4f08d1e024ee3ac7816eb2f873b2266306eef858f058133"},
@@ -3787,10 +2789,6 @@ pyzmq = [
{file = "pyzmq-23.2.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:ce4f71e17fa849de41a06109030d3f6815fcc33338bf98dd0dde6d456d33c929"},
{file = "pyzmq-23.2.0.tar.gz", hash = "sha256:a51f12a8719aad9dcfb55d456022f16b90abc8dde7d3ca93ce3120b40e3fa169"},
]
-recommonmark = [
- {file = "recommonmark-0.7.1-py2.py3-none-any.whl", hash = "sha256:1b1db69af0231efce3fa21b94ff627ea33dee7079a01dd0a7f8482c3da148b3f"},
- {file = "recommonmark-0.7.1.tar.gz", hash = "sha256:bdb4db649f2222dcd8d2d844f0006b958d627f732415d399791ee436a3686d67"},
-]
regex = [
{file = "regex-2022.6.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:042d122f9fee3ceb6d7e3067d56557df697d1aad4ff5f64ecce4dc13a90a7c01"},
{file = "regex-2022.6.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ffef4b30785dc2d1604dfb7cf9fca5dc27cd86d65f7c2a9ec34d6d3ae4565ec2"},
@@ -3867,174 +2865,30 @@ regex = [
{file = "regex-2022.6.2-cp39-cp39-win_amd64.whl", hash = "sha256:3b9b6289e03dbe6a6096880d8ac166cb23c38b4896ad235edee789d4e8697152"},
{file = "regex-2022.6.2.tar.gz", hash = "sha256:f7b43acb2c46fb2cd506965b2d9cf4c5e64c9c612bac26c1187933c7296bf08c"},
]
-requests = [
- {file = "requests-2.28.1-py3-none-any.whl", hash = "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"},
- {file = "requests-2.28.1.tar.gz", hash = "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983"},
-]
-requests-oauthlib = [
- {file = "requests-oauthlib-1.3.1.tar.gz", hash = "sha256:75beac4a47881eeb94d5ea5d6ad31ef88856affe2332b9aafb52c6452ccf0d7a"},
- {file = "requests_oauthlib-1.3.1-py2.py3-none-any.whl", hash = "sha256:2577c501a2fb8d05a304c09d090d6e47c306fef15809d102b327cf8364bddab5"},
-]
+requests = []
+requests-oauthlib = []
responses = [
{file = "responses-0.18.0-py3-none-any.whl", hash = "sha256:15c63ad16de13ee8e7182d99c9334f64fd81f1ee79f90748d527c28f7ca9dd51"},
{file = "responses-0.18.0.tar.gz", hash = "sha256:380cad4c1c1dc942e5e8a8eaae0b4d4edf708f4f010db8b7bcfafad1fcd254ff"},
]
-rsa = [
- {file = "rsa-4.8-py3-none-any.whl", hash = "sha256:95c5d300c4e879ee69708c428ba566c59478fd653cc3a22243eeb8ed846950bb"},
- {file = "rsa-4.8.tar.gz", hash = "sha256:5c6bd9dc7a543b7fe4304a631f8a8a3b674e2bbfc49c2ae96200cdbe55df6b17"},
-]
-scikit-learn = [
- {file = "scikit-learn-1.0.2.tar.gz", hash = "sha256:b5870959a5484b614f26d31ca4c17524b1b0317522199dc985c3b4256e030767"},
- {file = "scikit_learn-1.0.2-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:da3c84694ff693b5b3194d8752ccf935a665b8b5edc33a283122f4273ca3e687"},
- {file = "scikit_learn-1.0.2-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:75307d9ea39236cad7eea87143155eea24d48f93f3a2f9389c817f7019f00705"},
- {file = "scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f14517e174bd7332f1cca2c959e704696a5e0ba246eb8763e6c24876d8710049"},
- {file = "scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d9aac97e57c196206179f674f09bc6bffcd0284e2ba95b7fe0b402ac3f986023"},
- {file = "scikit_learn-1.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:d93d4c28370aea8a7cbf6015e8a669cd5d69f856cc2aa44e7a590fb805bb5583"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-macosx_10_13_x86_64.whl", hash = "sha256:85260fb430b795d806251dd3bb05e6f48cdc777ac31f2bcf2bc8bbed3270a8f5"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:a053a6a527c87c5c4fa7bf1ab2556fa16d8345cf99b6c5a19030a4a7cd8fd2c0"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:245c9b5a67445f6f044411e16a93a554edc1efdcce94d3fc0bc6a4b9ac30b752"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:158faf30684c92a78e12da19c73feff9641a928a8024b4fa5ec11d583f3d8a87"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:08ef968f6b72033c16c479c966bf37ccd49b06ea91b765e1cc27afefe723920b"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16455ace947d8d9e5391435c2977178d0ff03a261571e67f627c8fee0f9d431a"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-win32.whl", hash = "sha256:2f3b453e0b149898577e301d27e098dfe1a36943f7bb0ad704d1e548efc3b448"},
- {file = "scikit_learn-1.0.2-cp37-cp37m-win_amd64.whl", hash = "sha256:46f431ec59dead665e1370314dbebc99ead05e1c0a9df42f22d6a0e00044820f"},
- {file = "scikit_learn-1.0.2-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:ff3fa8ea0e09e38677762afc6e14cad77b5e125b0ea70c9bba1992f02c93b028"},
- {file = "scikit_learn-1.0.2-cp38-cp38-macosx_12_0_arm64.whl", hash = "sha256:9369b030e155f8188743eb4893ac17a27f81d28a884af460870c7c072f114243"},
- {file = "scikit_learn-1.0.2-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:7d6b2475f1c23a698b48515217eb26b45a6598c7b1840ba23b3c5acece658dbb"},
- {file = "scikit_learn-1.0.2-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:285db0352e635b9e3392b0b426bc48c3b485512d3b4ac3c7a44ec2a2ba061e66"},
- {file = "scikit_learn-1.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cb33fe1dc6f73dc19e67b264dbb5dde2a0539b986435fdd78ed978c14654830"},
- {file = "scikit_learn-1.0.2-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b1391d1a6e2268485a63c3073111fe3ba6ec5145fc957481cfd0652be571226d"},
- {file = "scikit_learn-1.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc3744dabc56b50bec73624aeca02e0def06b03cb287de26836e730659c5d29c"},
- {file = "scikit_learn-1.0.2-cp38-cp38-win32.whl", hash = "sha256:a999c9f02ff9570c783069f1074f06fe7386ec65b84c983db5aeb8144356a355"},
- {file = "scikit_learn-1.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:7626a34eabbf370a638f32d1a3ad50526844ba58d63e3ab81ba91e2a7c6d037e"},
- {file = "scikit_learn-1.0.2-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:a90b60048f9ffdd962d2ad2fb16367a87ac34d76e02550968719eb7b5716fd10"},
- {file = "scikit_learn-1.0.2-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:7a93c1292799620df90348800d5ac06f3794c1316ca247525fa31169f6d25855"},
- {file = "scikit_learn-1.0.2-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:eabceab574f471de0b0eb3f2ecf2eee9f10b3106570481d007ed1c84ebf6d6a1"},
- {file = "scikit_learn-1.0.2-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:55f2f3a8414e14fbee03782f9fe16cca0f141d639d2b1c1a36779fa069e1db57"},
- {file = "scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:80095a1e4b93bd33261ef03b9bc86d6db649f988ea4dbcf7110d0cded8d7213d"},
- {file = "scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fa38a1b9b38ae1fad2863eff5e0d69608567453fdfc850c992e6e47eb764e846"},
- {file = "scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff746a69ff2ef25f62b36338c615dd15954ddc3ab8e73530237dd73235e76d62"},
- {file = "scikit_learn-1.0.2-cp39-cp39-win32.whl", hash = "sha256:e174242caecb11e4abf169342641778f68e1bfaba80cd18acd6bc84286b9a534"},
- {file = "scikit_learn-1.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:b54a62c6e318ddbfa7d22c383466d38d2ee770ebdb5ddb668d56a099f6eaf75f"},
-]
-scipy = [
- {file = "scipy-1.7.3-1-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:c9e04d7e9b03a8a6ac2045f7c5ef741be86727d8f49c45db45f244bdd2bcff17"},
- {file = "scipy-1.7.3-1-cp38-cp38-macosx_12_0_arm64.whl", hash = "sha256:b0e0aeb061a1d7dcd2ed59ea57ee56c9b23dd60100825f98238c06ee5cc4467e"},
- {file = "scipy-1.7.3-1-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:b78a35c5c74d336f42f44106174b9851c783184a85a3fe3e68857259b37b9ffb"},
- {file = "scipy-1.7.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:173308efba2270dcd61cd45a30dfded6ec0085b4b6eb33b5eb11ab443005e088"},
- {file = "scipy-1.7.3-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:21b66200cf44b1c3e86495e3a436fc7a26608f92b8d43d344457c54f1c024cbc"},
- {file = "scipy-1.7.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ceebc3c4f6a109777c0053dfa0282fddb8893eddfb0d598574acfb734a926168"},
- {file = "scipy-1.7.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f7eaea089345a35130bc9a39b89ec1ff69c208efa97b3f8b25ea5d4c41d88094"},
- {file = "scipy-1.7.3-cp310-cp310-win_amd64.whl", hash = "sha256:304dfaa7146cffdb75fbf6bb7c190fd7688795389ad060b970269c8576d038e9"},
- {file = "scipy-1.7.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:033ce76ed4e9f62923e1f8124f7e2b0800db533828c853b402c7eec6e9465d80"},
- {file = "scipy-1.7.3-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:4d242d13206ca4302d83d8a6388c9dfce49fc48fdd3c20efad89ba12f785bf9e"},
- {file = "scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8499d9dd1459dc0d0fe68db0832c3d5fc1361ae8e13d05e6849b358dc3f2c279"},
- {file = "scipy-1.7.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca36e7d9430f7481fc7d11e015ae16fbd5575615a8e9060538104778be84addf"},
- {file = "scipy-1.7.3-cp37-cp37m-win32.whl", hash = "sha256:e2c036492e673aad1b7b0d0ccdc0cb30a968353d2c4bf92ac8e73509e1bf212c"},
- {file = "scipy-1.7.3-cp37-cp37m-win_amd64.whl", hash = "sha256:866ada14a95b083dd727a845a764cf95dd13ba3dc69a16b99038001b05439709"},
- {file = "scipy-1.7.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:65bd52bf55f9a1071398557394203d881384d27b9c2cad7df9a027170aeaef93"},
- {file = "scipy-1.7.3-cp38-cp38-macosx_12_0_arm64.whl", hash = "sha256:f99d206db1f1ae735a8192ab93bd6028f3a42f6fa08467d37a14eb96c9dd34a3"},
- {file = "scipy-1.7.3-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5f2cfc359379c56b3a41b17ebd024109b2049f878badc1e454f31418c3a18436"},
- {file = "scipy-1.7.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eb7ae2c4dbdb3c9247e07acc532f91077ae6dbc40ad5bd5dca0bb5a176ee9bda"},
- {file = "scipy-1.7.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95c2d250074cfa76715d58830579c64dff7354484b284c2b8b87e5a38321672c"},
- {file = "scipy-1.7.3-cp38-cp38-win32.whl", hash = "sha256:87069cf875f0262a6e3187ab0f419f5b4280d3dcf4811ef9613c605f6e4dca95"},
- {file = "scipy-1.7.3-cp38-cp38-win_amd64.whl", hash = "sha256:7edd9a311299a61e9919ea4192dd477395b50c014cdc1a1ac572d7c27e2207fa"},
- {file = "scipy-1.7.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eef93a446114ac0193a7b714ce67659db80caf940f3232bad63f4c7a81bc18df"},
- {file = "scipy-1.7.3-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:eb326658f9b73c07081300daba90a8746543b5ea177184daed26528273157294"},
- {file = "scipy-1.7.3-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:93378f3d14fff07572392ce6a6a2ceb3a1f237733bd6dcb9eb6a2b29b0d19085"},
- {file = "scipy-1.7.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:edad1cf5b2ce1912c4d8ddad20e11d333165552aba262c882e28c78bbc09dbf6"},
- {file = "scipy-1.7.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5d1cc2c19afe3b5a546ede7e6a44ce1ff52e443d12b231823268019f608b9b12"},
- {file = "scipy-1.7.3-cp39-cp39-win32.whl", hash = "sha256:2c56b820d304dffcadbbb6cbfbc2e2c79ee46ea291db17e288e73cd3c64fefa9"},
- {file = "scipy-1.7.3-cp39-cp39-win_amd64.whl", hash = "sha256:3f78181a153fa21c018d346f595edd648344751d7f03ab94b398be2ad083ed3e"},
- {file = "scipy-1.7.3.tar.gz", hash = "sha256:ab5875facfdef77e0a47d5fd39ea178b58e60e454a4c85aa1e52fcb80db7babf"},
-]
-send2trash = [
- {file = "Send2Trash-1.8.0-py3-none-any.whl", hash = "sha256:f20eaadfdb517eaca5ce077640cb261c7d2698385a6a0f072a4a5447fd49fa08"},
- {file = "Send2Trash-1.8.0.tar.gz", hash = "sha256:d2c24762fd3759860a0aff155e45871447ea58d2be6bdd39b5c8f966a0c99c2d"},
-]
-setuptools-scm = [
- {file = "setuptools_scm-7.0.4-py3-none-any.whl", hash = "sha256:53a6f51451a84d891ca485cec700a802413bbc5e76ee65da134e54c733a6e44d"},
- {file = "setuptools_scm-7.0.4.tar.gz", hash = "sha256:c27bc1f48593cfc9527251f1f0fc41ce282ea57bbc7fd5a1ea3acb99325fab4c"},
-]
+rsa = []
+scikit-learn = []
+scipy = []
+setuptools-scm = []
six = [
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
]
-smmap = [
- {file = "smmap-5.0.0-py3-none-any.whl", hash = "sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94"},
- {file = "smmap-5.0.0.tar.gz", hash = "sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936"},
-]
-snowballstemmer = [
- {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"},
- {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
-]
+smmap = []
soupsieve = [
{file = "soupsieve-2.3.2.post1-py3-none-any.whl", hash = "sha256:3b2503d3c7084a42b1ebd08116e5f81aadfaea95863628c80a3b774a11b7c759"},
{file = "soupsieve-2.3.2.post1.tar.gz", hash = "sha256:fc53893b3da2c33de295667a0e19f078c14bf86544af307354de5fcf12a3f30d"},
]
-sphinx = [
- {file = "Sphinx-5.0.2-py3-none-any.whl", hash = "sha256:d3e57663eed1d7c5c50895d191fdeda0b54ded6f44d5621b50709466c338d1e8"},
- {file = "Sphinx-5.0.2.tar.gz", hash = "sha256:b18e978ea7565720f26019c702cd85c84376e948370f1cd43d60265010e1c7b0"},
-]
-sphinx-automodapi = [
- {file = "sphinx-automodapi-0.13.tar.gz", hash = "sha256:e1019336df7f7f0bcbf848eff7b84e7bef71691a57d8b5bda9107a2a046a226a"},
- {file = "sphinx_automodapi-0.13-py3-none-any.whl", hash = "sha256:f9ebc9c10597f3aab1d93e5a8b1829903eee7c64f5bafb0cf71fd40e5c7d95f0"},
-]
-sphinx-copybutton = [
- {file = "sphinx-copybutton-0.4.0.tar.gz", hash = "sha256:8daed13a87afd5013c3a9af3575cc4d5bec052075ccd3db243f895c07a689386"},
- {file = "sphinx_copybutton-0.4.0-py3-none-any.whl", hash = "sha256:4340d33c169dac6dd82dce2c83333412aa786a42dd01a81a8decac3b130dc8b0"},
-]
-sphinx-rtd-theme = [
- {file = "sphinx_rtd_theme-0.5.2-py2.py3-none-any.whl", hash = "sha256:4a05bdbe8b1446d77a01e20a23ebc6777c74f43237035e76be89699308987d6f"},
- {file = "sphinx_rtd_theme-0.5.2.tar.gz", hash = "sha256:32bd3b5d13dc8186d7a42fc816a23d32e83a4827d7d9882948e7b837c232da5a"},
-]
-sphinxcontrib-applehelp = [
- {file = "sphinxcontrib-applehelp-1.0.2.tar.gz", hash = "sha256:a072735ec80e7675e3f432fcae8610ecf509c5f1869d17e2eecff44389cdbc58"},
- {file = "sphinxcontrib_applehelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:806111e5e962be97c29ec4c1e7fe277bfd19e9652fb1a4392105b43e01af885a"},
-]
-sphinxcontrib-devhelp = [
- {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
- {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
-]
-sphinxcontrib-htmlhelp = [
- {file = "sphinxcontrib-htmlhelp-2.0.0.tar.gz", hash = "sha256:f5f8bb2d0d629f398bf47d0d69c07bc13b65f75a81ad9e2f71a63d4b7a2f6db2"},
- {file = "sphinxcontrib_htmlhelp-2.0.0-py2.py3-none-any.whl", hash = "sha256:d412243dfb797ae3ec2b59eca0e52dac12e75a241bf0e4eb861e450d06c6ed07"},
-]
-sphinxcontrib-jsmath = [
- {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
- {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
-]
-sphinxcontrib-qthelp = [
- {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
- {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
-]
-sphinxcontrib-serializinghtml = [
- {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
- {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
-]
-stevedore = [
- {file = "stevedore-3.5.0-py3-none-any.whl", hash = "sha256:a547de73308fd7e90075bb4d301405bebf705292fa90a90fc3bcf9133f58616c"},
- {file = "stevedore-3.5.0.tar.gz", hash = "sha256:f40253887d8712eaa2bb0ea3830374416736dc8ec0e22f5a65092c1174c44335"},
-]
-structlog = [
- {file = "structlog-21.5.0-py3-none-any.whl", hash = "sha256:fd7922e195262b337da85c2a91c84be94ccab1f8fd1957bd6986f6904e3761c8"},
- {file = "structlog-21.5.0.tar.gz", hash = "sha256:68c4c29c003714fe86834f347cb107452847ba52414390a7ee583472bde00fc9"},
-]
-tensorboard = [
- {file = "tensorboard-2.9.1-py3-none-any.whl", hash = "sha256:baa727f791776f9e5841d347127720ceed4bbd59c36b40604b95fb2ae6029276"},
-]
-tensorboard-data-server = [
- {file = "tensorboard_data_server-0.6.1-py3-none-any.whl", hash = "sha256:809fe9887682d35c1f7d1f54f0f40f98bb1f771b14265b453ca051e2ce58fca7"},
- {file = "tensorboard_data_server-0.6.1-py3-none-macosx_10_9_x86_64.whl", hash = "sha256:fa8cef9be4fcae2f2363c88176638baf2da19c5ec90addb49b1cde05c95c88ee"},
- {file = "tensorboard_data_server-0.6.1-py3-none-manylinux2010_x86_64.whl", hash = "sha256:d8237580755e58eff68d1f3abefb5b1e39ae5c8b127cc40920f9c4fb33f4b98a"},
-]
-tensorboard-plugin-wit = [
- {file = "tensorboard_plugin_wit-1.8.1-py3-none-any.whl", hash = "sha256:ff26bdd583d155aa951ee3b152b3d0cffae8005dc697f72b44a8e8c2a77a8cbe"},
-]
-terminado = [
- {file = "terminado-0.15.0-py3-none-any.whl", hash = "sha256:0d5f126fbfdb5887b25ae7d9d07b0d716b1cc0ccaacc71c1f3c14d228e065197"},
- {file = "terminado-0.15.0.tar.gz", hash = "sha256:ab4eeedccfcc1e6134bfee86106af90852c69d602884ea3a1e8ca6d4486e9bfe"},
-]
+stevedore = []
+structlog = []
+tensorboard = []
+tensorboard-data-server = []
+tensorboard-plugin-wit = []
threadpoolctl = [
{file = "threadpoolctl-3.1.0-py3-none-any.whl", hash = "sha256:8b99adda265feb6773280df41eece7b2e6561b772d21ffd52e372f999024907b"},
{file = "threadpoolctl-3.1.0.tar.gz", hash = "sha256:a335baacfaa4400ae1f0d8e3a58d6674d2f8828e3716bb2802c44955ad391380"},
@@ -4078,14 +2932,8 @@ tokenizers = [
{file = "tokenizers-0.12.1-cp39-cp39-win_amd64.whl", hash = "sha256:2158baf80cbc09259bfd6e0e0fc4597b611e7a72ad5443dad63918a90f1dd304"},
{file = "tokenizers-0.12.1.tar.gz", hash = "sha256:070746f86efa6c873db341e55cf17bb5e7bdd5450330ca8eca542f5c3dab2c66"},
]
-toml = [
- {file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
- {file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
-]
-tomli = [
- {file = "tomli-1.2.3-py3-none-any.whl", hash = "sha256:e3069e4be3ead9668e21cb9b074cd948f7b3113fd9c8bba083f48247aab8b11c"},
- {file = "tomli-1.2.3.tar.gz", hash = "sha256:05b6166bff487dc068d322585c7ea4ef78deed501cc124060e0f238e89a9231f"},
-]
+toml = []
+tomli = []
torch = [
{file = "torch-1.12.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:3322d33a06e440d715bb214334bd41314c94632d9a2f07d22006bf21da3a2be4"},
{file = "torch-1.12.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:2568f011dddeb5990d8698cc375d237f14568ffa8489854e3b94113b4b6b7c8b"},
@@ -4108,35 +2956,12 @@ torch = [
{file = "torch-1.12.0-cp39-none-macosx_10_9_x86_64.whl", hash = "sha256:c0313438bc36448ffd209f5fb4e5f325b3af158cdf61c8829b8ddaf128c57816"},
{file = "torch-1.12.0-cp39-none-macosx_11_0_arm64.whl", hash = "sha256:5ed69d5af232c5c3287d44cef998880dadcc9721cd020e9ae02f42e56b79c2e4"},
]
-torch-hypothesis = [
- {file = "torch-hypothesis-0.2.0.tar.gz", hash = "sha256:eb6d1de384c78cfa6d55050d0e08626acbfb9defe6c05c3efe6bf8a83ecdec7c"},
- {file = "torch_hypothesis-0.2.0-py3-none-any.whl", hash = "sha256:b6f8ebc75080659668aaf60a2c6aa32276d8690cf4b97d4468228bd976dc1a71"},
-]
+torch-hypothesis = []
torchmetrics = [
{file = "torchmetrics-0.9.2-py3-none-any.whl", hash = "sha256:ced006295c95c4555df0b8dea92960c00e3303de0da878fcf27e394df4757827"},
{file = "torchmetrics-0.9.2.tar.gz", hash = "sha256:8178c9242e243318093d9b7237738a504535193d2006da6e58b0ed4003e318d2"},
]
-torchvision = [
- {file = "torchvision-0.13.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:61d5093a50b7923a4e5bf9e0271001c29e01abec2348b7dd93370a0a9d15836c"},
- {file = "torchvision-0.13.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6c4c35428c758adc485ff8f239b5ed68c1b6c26efa261a52e431cab0f7f22aec"},
- {file = "torchvision-0.13.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:acb72a40e5dc0cd454d28514dbdd589a5057afd9bb5c785b87a54718b999bfa1"},
- {file = "torchvision-0.13.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:df16abf31e7a5fce8db1f781bf1e4f20c8bc730c7c3f657e946cc5820c04e465"},
- {file = "torchvision-0.13.0-cp310-cp310-win_amd64.whl", hash = "sha256:01e9e7b2e7724e66561e8d98f900985d80191e977c5c0b3f33ed31800ba0210c"},
- {file = "torchvision-0.13.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5c31e9b3004142dbfdf32adc4cf2d4fd709b820833e9786f839ae3a91ff65ef0"},
- {file = "torchvision-0.13.0-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:a20662c11dc14fd4eff102ceb946a7ee80b9f98303bb52435cc903f2c4c1fe10"},
- {file = "torchvision-0.13.0-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:ada295dbfe55017b02acfab960a997387f5addbadd28ee5e575e24f692992ce4"},
- {file = "torchvision-0.13.0-cp37-cp37m-win_amd64.whl", hash = "sha256:ad458146aca15f652f9b0c227bebd5403602c7341f15f68f20ec119fa8e8f4a5"},
- {file = "torchvision-0.13.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:83a4d9d50787d1e886c94486b63b15978391f6cf1892fce6a93132c09b14e128"},
- {file = "torchvision-0.13.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:42d95ab197d090efc5669fec02fbc603d05c859e50ca2c60180d1a113aa9b3e2"},
- {file = "torchvision-0.13.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:1b703701f0b99f307ad925b1abda2b3d5bdbf30643ff02102b6aeeb8840ae278"},
- {file = "torchvision-0.13.0-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:667cac55afb13cda7d362466e7eba3119e529b210e55507d231bead09aca5e1f"},
- {file = "torchvision-0.13.0-cp38-cp38-win_amd64.whl", hash = "sha256:1e2049f1207631d42d743205f663f1d2235796565be3f18b0339d479626faf30"},
- {file = "torchvision-0.13.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c2278a189663087bb8e65915062aa7a25b8f8e5a3cfaa5879fe277e23e4bbf40"},
- {file = "torchvision-0.13.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:253eb0c67bf88cef4a79ec69058c3e94f9fde28b9e3699ad1afc0b3ed50f8075"},
- {file = "torchvision-0.13.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:0e28740bd5695076f7c449af650fc474d6566722d446461c2ceebf9c9599b37f"},
- {file = "torchvision-0.13.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:b620a43df4131ad09f5761c415a016a9ea95aaf8ec8c91d030fb59bad591094a"},
- {file = "torchvision-0.13.0-cp39-cp39-win_amd64.whl", hash = "sha256:b7a2c9aebc7ef265777fe7e82577364288d98cf6b8cf0a63bb2621df78a7af1a"},
-]
+torchvision = []
tornado = [
{file = "tornado-6.1-cp35-cp35m-macosx_10_9_x86_64.whl", hash = "sha256:d371e811d6b156d82aa5f9a4e08b58debf97c302a35714f6f45e35139c332e32"},
{file = "tornado-6.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:0d321a39c36e5f2c4ff12b4ed58d41390460f798422c4504e09eb5678e09998c"},
@@ -4192,62 +3017,18 @@ transformers = [
{file = "transformers-4.20.1-py3-none-any.whl", hash = "sha256:d284eaf60b10fee45b24688423b5f7ba2d194f8c2dadf8df76cd58c1a9d08b52"},
{file = "transformers-4.20.1.tar.gz", hash = "sha256:65ee4ae9abdeca8fe3a9e351256345e3c4db2a6a68accd5d6a141cfff6192751"},
]
-typed-ast = [
- {file = "typed_ast-1.4.3-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:2068531575a125b87a41802130fa7e29f26c09a2833fea68d9a40cf33902eba6"},
- {file = "typed_ast-1.4.3-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:c907f561b1e83e93fad565bac5ba9c22d96a54e7ea0267c708bffe863cbe4075"},
- {file = "typed_ast-1.4.3-cp35-cp35m-manylinux2014_aarch64.whl", hash = "sha256:1b3ead4a96c9101bef08f9f7d1217c096f31667617b58de957f690c92378b528"},
- {file = "typed_ast-1.4.3-cp35-cp35m-win32.whl", hash = "sha256:dde816ca9dac1d9c01dd504ea5967821606f02e510438120091b84e852367428"},
- {file = "typed_ast-1.4.3-cp35-cp35m-win_amd64.whl", hash = "sha256:777a26c84bea6cd934422ac2e3b78863a37017618b6e5c08f92ef69853e765d3"},
- {file = "typed_ast-1.4.3-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f8afcf15cc511ada719a88e013cec87c11aff7b91f019295eb4530f96fe5ef2f"},
- {file = "typed_ast-1.4.3-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:52b1eb8c83f178ab787f3a4283f68258525f8d70f778a2f6dd54d3b5e5fb4341"},
- {file = "typed_ast-1.4.3-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:01ae5f73431d21eead5015997ab41afa53aa1fbe252f9da060be5dad2c730ace"},
- {file = "typed_ast-1.4.3-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:c190f0899e9f9f8b6b7863debfb739abcb21a5c054f911ca3596d12b8a4c4c7f"},
- {file = "typed_ast-1.4.3-cp36-cp36m-win32.whl", hash = "sha256:398e44cd480f4d2b7ee8d98385ca104e35c81525dd98c519acff1b79bdaac363"},
- {file = "typed_ast-1.4.3-cp36-cp36m-win_amd64.whl", hash = "sha256:bff6ad71c81b3bba8fa35f0f1921fb24ff4476235a6e94a26ada2e54370e6da7"},
- {file = "typed_ast-1.4.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0fb71b8c643187d7492c1f8352f2c15b4c4af3f6338f21681d3681b3dc31a266"},
- {file = "typed_ast-1.4.3-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:760ad187b1041a154f0e4d0f6aae3e40fdb51d6de16e5c99aedadd9246450e9e"},
- {file = "typed_ast-1.4.3-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:5feca99c17af94057417d744607b82dd0a664fd5e4ca98061480fd8b14b18d04"},
- {file = "typed_ast-1.4.3-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:95431a26309a21874005845c21118c83991c63ea800dd44843e42a916aec5899"},
- {file = "typed_ast-1.4.3-cp37-cp37m-win32.whl", hash = "sha256:aee0c1256be6c07bd3e1263ff920c325b59849dc95392a05f258bb9b259cf39c"},
- {file = "typed_ast-1.4.3-cp37-cp37m-win_amd64.whl", hash = "sha256:9ad2c92ec681e02baf81fdfa056fe0d818645efa9af1f1cd5fd6f1bd2bdfd805"},
- {file = "typed_ast-1.4.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b36b4f3920103a25e1d5d024d155c504080959582b928e91cb608a65c3a49e1a"},
- {file = "typed_ast-1.4.3-cp38-cp38-manylinux1_i686.whl", hash = "sha256:067a74454df670dcaa4e59349a2e5c81e567d8d65458d480a5b3dfecec08c5ff"},
- {file = "typed_ast-1.4.3-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:7538e495704e2ccda9b234b82423a4038f324f3a10c43bc088a1636180f11a41"},
- {file = "typed_ast-1.4.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:af3d4a73793725138d6b334d9d247ce7e5f084d96284ed23f22ee626a7b88e39"},
- {file = "typed_ast-1.4.3-cp38-cp38-win32.whl", hash = "sha256:f2362f3cb0f3172c42938946dbc5b7843c2a28aec307c49100c8b38764eb6927"},
- {file = "typed_ast-1.4.3-cp38-cp38-win_amd64.whl", hash = "sha256:dd4a21253f42b8d2b48410cb31fe501d32f8b9fbeb1f55063ad102fe9c425e40"},
- {file = "typed_ast-1.4.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f328adcfebed9f11301eaedfa48e15bdece9b519fb27e6a8c01aa52a17ec31b3"},
- {file = "typed_ast-1.4.3-cp39-cp39-manylinux1_i686.whl", hash = "sha256:2c726c276d09fc5c414693a2de063f521052d9ea7c240ce553316f70656c84d4"},
- {file = "typed_ast-1.4.3-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:cae53c389825d3b46fb37538441f75d6aecc4174f615d048321b716df2757fb0"},
- {file = "typed_ast-1.4.3-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:b9574c6f03f685070d859e75c7f9eeca02d6933273b5e69572e5ff9d5e3931c3"},
- {file = "typed_ast-1.4.3-cp39-cp39-win32.whl", hash = "sha256:209596a4ec71d990d71d5e0d312ac935d86930e6eecff6ccc7007fe54d703808"},
- {file = "typed_ast-1.4.3-cp39-cp39-win_amd64.whl", hash = "sha256:9c6d1a54552b5330bc657b7ef0eae25d00ba7ffe85d9ea8ae6540d2197a3788c"},
- {file = "typed_ast-1.4.3.tar.gz", hash = "sha256:fb1bbeac803adea29cedd70781399c99138358c26d05fcbd23c13016b7f5ec65"},
-]
-typing-extensions = [
- {file = "typing_extensions-4.3.0-py3-none-any.whl", hash = "sha256:25642c956049920a5aa49edcdd6ab1e06d7e5d467fc00e0506c44ac86fbfca02"},
- {file = "typing_extensions-4.3.0.tar.gz", hash = "sha256:e6d2677a32f47fc7eb2795db1dd15c1f34eff616bcaf2cfb5e997f854fa1c4a6"},
-]
+typed-ast = []
+typing-extensions = []
urllib3 = [
{file = "urllib3-1.26.9-py2.py3-none-any.whl", hash = "sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14"},
{file = "urllib3-1.26.9.tar.gz", hash = "sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e"},
]
-wcwidth = [
- {file = "wcwidth-0.2.5-py2.py3-none-any.whl", hash = "sha256:beb4802a9cebb9144e99086eff703a642a13d6a0052920003a230f3294bbe784"},
- {file = "wcwidth-0.2.5.tar.gz", hash = "sha256:c4d647b99872929fdb7bdcaa4fbe7f01413ed3d98077df798530e5b04f116c83"},
-]
+watchdog = []
webencodings = [
{file = "webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78"},
{file = "webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923"},
]
-werkzeug = [
- {file = "Werkzeug-2.1.2-py3-none-any.whl", hash = "sha256:72a4b735692dd3135217911cbeaa1be5fa3f62bffb8745c5215420a03dc55255"},
- {file = "Werkzeug-2.1.2.tar.gz", hash = "sha256:1ce08e8093ed67d638d63879fd1ba3735817f7a80de3674d293f5984f25fb6e6"},
-]
-widgetsnbextension = [
- {file = "widgetsnbextension-3.6.1-py2.py3-none-any.whl", hash = "sha256:954e0faefdd414e4e013f17dbc7fd86f24cf1d243a3ac85d5f0fc2c2d2b50c66"},
- {file = "widgetsnbextension-3.6.1.tar.gz", hash = "sha256:9c84ae64c2893c7cbe2eaafc7505221a795c27d68938454034ac487319a75b10"},
-]
+werkzeug = []
xxhash = [
{file = "xxhash-3.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:219cba13991fd73cf21a5efdafa5056f0ae0b8f79e5e0112967e3058daf73eea"},
{file = "xxhash-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3fcbb846af15eff100c412ae54f4974ff277c92eacd41f1ec7803a64fd07fa0c"},
From 5ca981200ce01fd706d786131e8bd2e48b26b018 Mon Sep 17 00:00:00 2001
From: Dref360
Date: Sat, 16 Jul 2022 12:49:49 -0400
Subject: [PATCH 05/10] Add Colab button
---
docs/tutorials/index.md | 3 +-
docs/tutorials/label-studio.md | 4 +-
docs/user_guide/heuristics.md | 2 +-
mkdocs.yml | 1 -
notebooks/active_learning_process.ipynb | 2 +
notebooks/baal_prod_cls.ipynb | 51 +-
.../compatibility/nlp_classification.ipynb | 69 +-
.../compatibility/sklearn_tutorial.ipynb | 2 +
notebooks/deep_ensemble.ipynb | 2 +
notebooks/fairness/ActiveFairness.ipynb | 2 +
notebooks/fundamentals/active-learning.ipynb | 4 +-
notebooks/fundamentals/posteriors.ipynb | 2 +
poetry.lock | 1063 ++---------------
pyproject.toml | 2 +-
14 files changed, 153 insertions(+), 1056 deletions(-)
diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md
index a8711f11..c1e3095b 100644
--- a/docs/tutorials/index.md
+++ b/docs/tutorials/index.md
@@ -13,4 +13,5 @@ latter on how we integrate with other common frameworks such as Label Studio, Hu
* [:material-link: Lightning Flash](https://devblog.pytorchlightning.ai/active-learning-made-simple-using-flash-and-baal-2216df6f872c)
* [HuggingFace](../notebooks/compatibility/nlp_classification.ipynb)
-* [Scikit-Learn](../notebooks/compatibility/sklearn_tutorial.ipynb)
\ No newline at end of file
+* [Scikit-Learn](../notebooks/compatibility/sklearn_tutorial.ipynb)
+* [Label Studio](./label-studio.md)
\ No newline at end of file
diff --git a/docs/tutorials/label-studio.md b/docs/tutorials/label-studio.md
index 206f103b..54832e7a 100644
--- a/docs/tutorials/label-studio.md
+++ b/docs/tutorials/label-studio.md
@@ -11,10 +11,12 @@ This is also a good way to start the conversation between your labelling team an
We will build upon Label Studio's [Pytorch transfer learning](https://github.com/heartexlabs/label-studio-ml-backend/blob/master/label_studio_ml/examples/pytorch_transfer_learning.py) example, so be sure to download it and try to run it before adding BaaL to it. The full example can be found [here](https://gist.github.com/Dref360/288845b2fbb0504e4cfc216a76b547e7).
More info:
+
* [BaaL documentation](https://baal.readthedocs.io/en/latest/)
* [Bayesian Deep Learning cheatsheet](https://baal.readthedocs.io/en/latest/user_guide/baal_cheatsheet.html)
Support:
+
* [Github](https://github.com/ElementAI/baal)
* [Gitter](https://gitter.im/eai-baal/community)
@@ -30,7 +32,7 @@ RUN pip install --no-cache \
uwsgi==2.0.19.1 \
supervisor==4.2.2 \
label-studio==1.0.2 \
- baal==1.3.0 \
+ baal \
click==7.1.2 \
git+https://github.com/heartexlabs/label-studio-ml-backend
```
diff --git a/docs/user_guide/heuristics.md b/docs/user_guide/heuristics.md
index 019f7ecb..ec82df4d 100644
--- a/docs/user_guide/heuristics.md
+++ b/docs/user_guide/heuristics.md
@@ -10,7 +10,7 @@ We will cover the two main heuristics: **Entropy** and **BALD**.
The goal of this heuristic is to maximize information. To do so, we will compute the entropy of each prediction before ordering them.
-Let $p_c(x)$ be the probability of input $x$ to be from class $c$. The entropy can be computed as:
+Let $p_{c}(x)$ be the probability that input $x$ belongs to class $c$. The entropy can be computed as:
$$
H(x) = -\sum_c^C p_c(x) \log p_c(x)
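
A minimal numpy sketch of the score this heuristic produces, for readers who want to map the formula to code; it is illustrative only and is not Baal's own `Entropy` class, whose exact interface should be taken from the API reference.

```python
import numpy as np

def entropy_scores(probs: np.ndarray) -> np.ndarray:
    """Shannon entropy per sample; `probs` has shape [n_samples, n_classes]."""
    eps = 1e-12  # guard against log(0)
    return -np.sum(probs * np.log(probs + eps), axis=1)

# The most uncertain (highest-entropy) samples are queried first.
probs = np.array([[0.98, 0.01, 0.01],   # confident prediction -> low entropy
                  [0.34, 0.33, 0.33]])  # near-uniform prediction -> high entropy
ranking = np.argsort(-entropy_scores(probs))
print(ranking)  # [1 0]
```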
diff --git a/mkdocs.yml b/mkdocs.yml
index 5db0fab7..2febadb1 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -27,7 +27,6 @@ theme:
icon:
repo: fontawesome/brands/github
plugins:
- - search
- mkdocs-jupyter
- mkdocstrings
diff --git a/notebooks/active_learning_process.ipynb b/notebooks/active_learning_process.ipynb
index 37b91082..0bd05c6c 100644
--- a/notebooks/active_learning_process.ipynb
+++ b/notebooks/active_learning_process.ipynb
@@ -10,6 +10,8 @@
"source": [
"# How to do research and visualize progress\n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/active_learning_process.ipynb)\n",
+ "\n",
"In this tutorial, we will show how to use Baal for research ie. when we know the labels.\n",
"We will introduce notions such as dataset management, MC-Dropout, BALD. If you need more documentation, be sure to check our **Additional resources** section below!\n",
"\n",
diff --git a/notebooks/baal_prod_cls.ipynb b/notebooks/baal_prod_cls.ipynb
index 99f9a50c..01affdc2 100644
--- a/notebooks/baal_prod_cls.ipynb
+++ b/notebooks/baal_prod_cls.ipynb
@@ -11,6 +11,8 @@
"source": [
"# Use BaaL in production (Classification)\n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/baal_prod_cls.ipynb)\n",
+ "\n",
"In this tutorial, we will show you how to use BaaL during your labeling task.\n",
"\n",
"**NOTE** In this tutorial, we assume that we do not know the labels!\n",
@@ -378,55 +380,14 @@
},
{
"cell_type": "code",
- "execution_count": 10,
+ "execution_count": null,
"metadata": {
"pycharm": {
- "name": "#%%\n"
+ "name": "#%%\n",
+ "is_executing": true
}
},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "Training on 110 items!\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T14:50:02.089160Z [\u001B[32minfo ] Starting training dataset=110 epoch=5\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T14:50:19.678241Z [\u001B[32minfo ] Training complete train_loss=1.9793428182601929\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T14:50:19.681509Z [\u001B[32minfo ] Starting evaluating dataset=1725\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T14:50:33.777658Z [\u001B[32minfo ] Evaluation complete test_loss=2.013453960418701\n",
- "Metrics: {'test_loss': 2.013453960418701, 'train_loss': 1.9793428182601929}\n",
- "[103-MainThread ] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T14:50:33.784990Z [\u001B[32minfo ] Start Predict dataset=5064\n",
- "Training on 120 items!\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T14:52:14.295969Z [\u001B[32minfo ] Starting training dataset=120 epoch=5\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T14:52:32.482238Z [\u001B[32minfo ] Training complete train_loss=1.8900309801101685\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T14:52:32.484473Z [\u001B[32minfo ] Starting evaluating dataset=1725\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T14:52:46.287436Z [\u001B[32minfo ] Evaluation complete test_loss=1.8315811157226562\n",
- "Metrics: {'test_loss': 1.8315811157226562, 'train_loss': 1.8900309801101685}\n",
- "[103-MainThread ] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T14:52:46.367016Z [\u001B[32minfo ] Start Predict dataset=5054\n",
- "Training on 130 items!\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T14:54:26.794349Z [\u001B[32minfo ] Starting training dataset=130 epoch=5\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T14:54:44.481490Z [\u001B[32minfo ] Training complete train_loss=1.961772084236145\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T14:54:44.483477Z [\u001B[32minfo ] Starting evaluating dataset=1725\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T14:54:58.268424Z [\u001B[32minfo ] Evaluation complete test_loss=1.859472393989563\n",
- "Metrics: {'test_loss': 1.859472393989563, 'train_loss': 1.961772084236145}\n",
- "[103-MainThread ] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T14:54:58.276565Z [\u001B[32minfo ] Start Predict dataset=5044\n",
- "Training on 140 items!\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T14:56:38.406344Z [\u001B[32minfo ] Starting training dataset=140 epoch=5\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T14:56:57.088064Z [\u001B[32minfo ] Training complete train_loss=1.8688158988952637\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T14:56:57.091358Z [\u001B[32minfo ] Starting evaluating dataset=1725\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T14:57:10.968456Z [\u001B[32minfo ] Evaluation complete test_loss=1.7242822647094727\n",
- "Metrics: {'test_loss': 1.7242822647094727, 'train_loss': 1.8688158988952637}\n",
- "[103-MainThread ] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T14:57:10.977104Z [\u001B[32minfo ] Start Predict dataset=5034\n",
- "Training on 150 items!\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:109] 2021-07-28T14:58:51.197386Z [\u001B[32minfo ] Starting training dataset=150 epoch=5\n",
- "[103-MainThread ] [baal.modelwrapper:train_on_dataset:119] 2021-07-28T14:59:09.779341Z [\u001B[32minfo ] Training complete train_loss=1.8381125926971436\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:147] 2021-07-28T14:59:09.782580Z [\u001B[32minfo ] Starting evaluating dataset=1725\n",
- "[103-MainThread ] [baal.modelwrapper:test_on_dataset:156] 2021-07-28T14:59:23.176680Z [\u001B[32minfo ] Evaluation complete test_loss=1.7318601608276367\n",
- "Metrics: {'test_loss': 1.7318601608276367, 'train_loss': 1.8381125926971436}\n",
- "[103-MainThread ] [baal.modelwrapper:predict_on_dataset_generator:241] 2021-07-28T14:59:23.184444Z [\u001B[32minfo ] Start Predict dataset=5024\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"# 5. If not done, go back to 2.\n",
"for step in range(5): # 5 Active Learning step!\n",
diff --git a/notebooks/compatibility/nlp_classification.ipynb b/notebooks/compatibility/nlp_classification.ipynb
index e743c7a6..71c76144 100644
--- a/notebooks/compatibility/nlp_classification.ipynb
+++ b/notebooks/compatibility/nlp_classification.ipynb
@@ -3,9 +3,16 @@
{
"cell_type": "markdown",
"id": "still-resident",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"# Active Learning for NLP Classification\n",
+ "\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/compatibility/nlp_classification.ipynb)\n",
+ "\n",
"In this tutorial, we guide you through using our new [HuggingFace](https://huggingface.co/transformers/main_classes/trainer.html) trainer wrapper to do active learning with transformers models.\n",
" Any model which could be trained by HuggingFace trainer and has `Dropout` layers could be used in the same manner.\n",
"\n",
@@ -18,7 +25,11 @@
"cell_type": "code",
"execution_count": 1,
"id": "sixth-wound",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stderr",
@@ -37,7 +48,11 @@
{
"cell_type": "markdown",
"id": "mechanical-tennessee",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## ActiveLearning Dataset\n",
"In order to create an active learning dataset, we need to wrap the dataset with `baal.ActiveLearningDataset`.\n",
@@ -49,7 +64,11 @@
"cell_type": "code",
"execution_count": 2,
"id": "liquid-replacement",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stdout",
@@ -77,7 +96,11 @@
{
"cell_type": "markdown",
"id": "ready-participation",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Active Learning Model\n",
"The process of making a model bayesian is exactly the same as before. In this case, we will get the `Bert` model and use `baal.bayesian.dropout.patch_module` to make the dropout layer stochastic at inference time. "
@@ -87,7 +110,11 @@
"cell_type": "code",
"execution_count": 3,
"id": "baking-coalition",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"name": "stderr",
@@ -119,7 +146,11 @@
{
"cell_type": "markdown",
"id": "eleven-portugal",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## Heuristic\n",
"\n",
@@ -133,7 +164,11 @@
"cell_type": "code",
"execution_count": 4,
"id": "cooperative-constant",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [],
"source": [
"from baal.active import get_heuristic\n",
@@ -144,7 +179,11 @@
{
"cell_type": "markdown",
"id": "listed-kelly",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
"source": [
"## HugginFace Trainer Wrapper\n",
"\n",
@@ -158,7 +197,11 @@
"cell_type": "code",
"execution_count": 5,
"id": "moving-olive",
- "metadata": {},
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
"outputs": [
{
"data": {
@@ -200,7 +243,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
- "[93-MainThread ] [baal.transformers_trainer_wrapper:predict_on_dataset_generator:61] 2021-03-08T20:15:36.980534Z [\u001b[32minfo ] Start Predict dataset=67249\n"
+ "[93-MainThread ] [baal.transformers_trainer_wrapper:predict_on_dataset_generator:61] 2021-03-08T20:15:36.980534Z [\u001B[32minfo ] Start Predict dataset=67249\n"
]
},
{
@@ -394,7 +437,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
- "[93-MainThread ] [baal.transformers_trainer_wrapper:predict_on_dataset_generator:61] 2021-03-08T20:28:15.903378Z [\u001b[32minfo ] Start Predict dataset=67239\n"
+ "[93-MainThread ] [baal.transformers_trainer_wrapper:predict_on_dataset_generator:61] 2021-03-08T20:28:15.903378Z [\u001B[32minfo ] Start Predict dataset=67239\n"
]
},
{
@@ -620,4 +663,4 @@
},
"nbformat": 4,
"nbformat_minor": 5
-}
+}
\ No newline at end of file
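
The markdown cells in this notebook reference `baal.ActiveLearningDataset`, `baal.bayesian.dropout.patch_module` and `get_heuristic`. The following is a condensed, hypothetical sketch of how these pieces fit together outside the HuggingFace wrapper; helper names such as `label_randomly` and the `[n_samples, n_classes, n_iterations]` prediction layout are recalled from Baal's documentation and should be verified against the current API reference.

```python
import numpy as np
import torch
from torch import nn
from torch.utils.data import TensorDataset

from baal.active import ActiveLearningDataset, get_heuristic
from baal.bayesian.dropout import patch_module

# 1. Wrap any torch Dataset so Baal can track the labelled / unlabelled split.
full_dataset = TensorDataset(torch.randn(100, 16), torch.randint(0, 2, (100,)))
al_dataset = ActiveLearningDataset(full_dataset)
al_dataset.label_randomly(10)  # seed the labelled set (assumed helper)

# 2. Make Dropout stochastic at inference time (MC-Dropout).
model = patch_module(nn.Sequential(nn.Linear(16, 32), nn.Dropout(0.5), nn.Linear(32, 2)))

# 3. Rank the unlabelled pool with a heuristic.
heuristic = get_heuristic("bald")
# Predictions are assumed to be stacked MC samples: [n_samples, n_classes, n_iterations].
pool_predictions = np.random.rand(len(al_dataset.pool), 2, 20)
ranked = heuristic(pool_predictions)  # pool indices, most informative first
al_dataset.label(ranked[:10])         # send the top items to the labelling step
```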
diff --git a/notebooks/compatibility/sklearn_tutorial.ipynb b/notebooks/compatibility/sklearn_tutorial.ipynb
index f227082c..c6f2c81a 100644
--- a/notebooks/compatibility/sklearn_tutorial.ipynb
+++ b/notebooks/compatibility/sklearn_tutorial.ipynb
@@ -11,6 +11,8 @@
"source": [
"# How to use BaaL with Scikit-Learn models\n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/compatibility/sklearn_tutorial.ipynb)\n",
+ "\n",
"In this tutorial, you will learn how to use BaaL on a scikit-learn model.\n",
"In this case, we will use `RandomForestClassifier`.\n",
"\n",
diff --git a/notebooks/deep_ensemble.ipynb b/notebooks/deep_ensemble.ipynb
index 4dac30b8..fbf15206 100644
--- a/notebooks/deep_ensemble.ipynb
+++ b/notebooks/deep_ensemble.ipynb
@@ -10,6 +10,8 @@
"source": [
"# How to use Deep ensembles in BaaL\n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/deep_ensemble.ipynb)\n",
+ "\n",
"Ensemble are one of the easiest form of Bayesian deep learning.\n",
" The main drawback from this approach is the important amount of computational resources needed to perform it.\n",
" In this notebook, we will present BaaL's Ensemble API namely `EnsembleModelWrapper`.\n",
diff --git a/notebooks/fairness/ActiveFairness.ipynb b/notebooks/fairness/ActiveFairness.ipynb
index 168ed79d..970c4924 100644
--- a/notebooks/fairness/ActiveFairness.ipynb
+++ b/notebooks/fairness/ActiveFairness.ipynb
@@ -11,6 +11,8 @@
"## Can active learning preemptively mitigate fairness issues?\n",
"*By Parmida Atighehchian*\n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/fairness/ActiveFairness.ipynb)\n",
+ "\n",
"The purpose of this notebook is to demonstrate the prilimary results of our recent [contribution](https://arxiv.org/abs/2104.06879) to ICLR workshop of Responsible AI 2021.\n",
"We show that active learning could help in creating fairer datasets without the need to know the bias in the dataset. This is important since in real scenarios, the source of bias is often unknown. Using active learning (i.e. BALD), we show that the prior knowledge of the bias is not necessary and hence it could be easier to integrate this setup in pipelines to make sure that the dataset is generally fairer and the possible biases are reduced. \n",
"\n",
diff --git a/notebooks/fundamentals/active-learning.ipynb b/notebooks/fundamentals/active-learning.ipynb
index 068e39fb..05b103b2 100644
--- a/notebooks/fundamentals/active-learning.ipynb
+++ b/notebooks/fundamentals/active-learning.ipynb
@@ -10,6 +10,8 @@
"source": [
"# Active learning infrastructure objects\n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/fundamentals/active-learning.ipynb)\n",
+ "\n",
"Active learning, or interactively choosing datapoints to request labels for,\n",
"presents a challenge that requires some data handling infrastructure that's\n",
"slightly different to the normal pytorch dataset classes. In particular, a\n",
@@ -38,7 +40,7 @@
},
"outputs": [],
"source": [
- "path = \"/Users/jan/datasets/mnist/\""
+ "path = \"/tmp\""
]
},
{
diff --git a/notebooks/fundamentals/posteriors.ipynb b/notebooks/fundamentals/posteriors.ipynb
index 0acb9254..d59fe1a1 100644
--- a/notebooks/fundamentals/posteriors.ipynb
+++ b/notebooks/fundamentals/posteriors.ipynb
@@ -10,6 +10,8 @@
"source": [
"# Methods for approximating bayesian posteriors \n",
"\n",
+ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/baal-org/baal/blob/master/notebooks/fundamentals/posteriors.ipynb)\n",
+ "\n",
"When we started developing active learning methods, we realised that what we wanted to\n",
"achieve required estimating the uncertainty of models. Doing so for neural networks is\n",
"an ongoing active research area.\n",
diff --git a/poetry.lock b/poetry.lock
index 4829b32c..d6e6b225 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -2085,223 +2085,42 @@ vision = ["torchvision", "lightning-flash"]
[metadata]
lock-version = "1.1"
python-versions = ">=3.7.1,<3.11"
-content-hash = "b078b2c2d31acd58eeebb4073f106d97bffe25443962a34129c5be61c91eb51b"
+content-hash = "8b24bf08ccb4adbc089aba994c49152a88064877e50237bb5747f814f7dfd15b"
[metadata.files]
absl-py = []
-aiohttp = [
- {file = "aiohttp-3.8.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1ed0b6477896559f17b9eaeb6d38e07f7f9ffe40b9f0f9627ae8b9926ae260a8"},
- {file = "aiohttp-3.8.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7dadf3c307b31e0e61689cbf9e06be7a867c563d5a63ce9dca578f956609abf8"},
- {file = "aiohttp-3.8.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a79004bb58748f31ae1cbe9fa891054baaa46fb106c2dc7af9f8e3304dc30316"},
- {file = "aiohttp-3.8.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:12de6add4038df8f72fac606dff775791a60f113a725c960f2bab01d8b8e6b15"},
- {file = "aiohttp-3.8.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6f0d5f33feb5f69ddd57a4a4bd3d56c719a141080b445cbf18f238973c5c9923"},
- {file = "aiohttp-3.8.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eaba923151d9deea315be1f3e2b31cc39a6d1d2f682f942905951f4e40200922"},
- {file = "aiohttp-3.8.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:099ebd2c37ac74cce10a3527d2b49af80243e2a4fa39e7bce41617fbc35fa3c1"},
- {file = "aiohttp-3.8.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2e5d962cf7e1d426aa0e528a7e198658cdc8aa4fe87f781d039ad75dcd52c516"},
- {file = "aiohttp-3.8.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:fa0ffcace9b3aa34d205d8130f7873fcfefcb6a4dd3dd705b0dab69af6712642"},
- {file = "aiohttp-3.8.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:61bfc23df345d8c9716d03717c2ed5e27374e0fe6f659ea64edcd27b4b044cf7"},
- {file = "aiohttp-3.8.1-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:31560d268ff62143e92423ef183680b9829b1b482c011713ae941997921eebc8"},
- {file = "aiohttp-3.8.1-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:01d7bdb774a9acc838e6b8f1d114f45303841b89b95984cbb7d80ea41172a9e3"},
- {file = "aiohttp-3.8.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:97ef77eb6b044134c0b3a96e16abcb05ecce892965a2124c566af0fd60f717e2"},
- {file = "aiohttp-3.8.1-cp310-cp310-win32.whl", hash = "sha256:c2aef4703f1f2ddc6df17519885dbfa3514929149d3ff900b73f45998f2532fa"},
- {file = "aiohttp-3.8.1-cp310-cp310-win_amd64.whl", hash = "sha256:713ac174a629d39b7c6a3aa757b337599798da4c1157114a314e4e391cd28e32"},
- {file = "aiohttp-3.8.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:473d93d4450880fe278696549f2e7aed8cd23708c3c1997981464475f32137db"},
- {file = "aiohttp-3.8.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99b5eeae8e019e7aad8af8bb314fb908dd2e028b3cdaad87ec05095394cce632"},
- {file = "aiohttp-3.8.1-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3af642b43ce56c24d063325dd2cf20ee012d2b9ba4c3c008755a301aaea720ad"},
- {file = "aiohttp-3.8.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3630c3ef435c0a7c549ba170a0633a56e92629aeed0e707fec832dee313fb7a"},
- {file = "aiohttp-3.8.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:4a4a4e30bf1edcad13fb0804300557aedd07a92cabc74382fdd0ba6ca2661091"},
- {file = "aiohttp-3.8.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6f8b01295e26c68b3a1b90efb7a89029110d3a4139270b24fda961893216c440"},
- {file = "aiohttp-3.8.1-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:a25fa703a527158aaf10dafd956f7d42ac6d30ec80e9a70846253dd13e2f067b"},
- {file = "aiohttp-3.8.1-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:5bfde62d1d2641a1f5173b8c8c2d96ceb4854f54a44c23102e2ccc7e02f003ec"},
- {file = "aiohttp-3.8.1-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:51467000f3647d519272392f484126aa716f747859794ac9924a7aafa86cd411"},
- {file = "aiohttp-3.8.1-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:03a6d5349c9ee8f79ab3ff3694d6ce1cfc3ced1c9d36200cb8f08ba06bd3b782"},
- {file = "aiohttp-3.8.1-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:102e487eeb82afac440581e5d7f8f44560b36cf0bdd11abc51a46c1cd88914d4"},
- {file = "aiohttp-3.8.1-cp36-cp36m-win32.whl", hash = "sha256:4aed991a28ea3ce320dc8ce655875e1e00a11bdd29fe9444dd4f88c30d558602"},
- {file = "aiohttp-3.8.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b0e20cddbd676ab8a64c774fefa0ad787cc506afd844de95da56060348021e96"},
- {file = "aiohttp-3.8.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:37951ad2f4a6df6506750a23f7cbabad24c73c65f23f72e95897bb2cecbae676"},
- {file = "aiohttp-3.8.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5c23b1ad869653bc818e972b7a3a79852d0e494e9ab7e1a701a3decc49c20d51"},
- {file = "aiohttp-3.8.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:15b09b06dae900777833fe7fc4b4aa426556ce95847a3e8d7548e2d19e34edb8"},
- {file = "aiohttp-3.8.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:477c3ea0ba410b2b56b7efb072c36fa91b1e6fc331761798fa3f28bb224830dd"},
- {file = "aiohttp-3.8.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:2f2f69dca064926e79997f45b2f34e202b320fd3782f17a91941f7eb85502ee2"},
- {file = "aiohttp-3.8.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:ef9612483cb35171d51d9173647eed5d0069eaa2ee812793a75373447d487aa4"},
- {file = "aiohttp-3.8.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:6d69f36d445c45cda7b3b26afef2fc34ef5ac0cdc75584a87ef307ee3c8c6d00"},
- {file = "aiohttp-3.8.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:55c3d1072704d27401c92339144d199d9de7b52627f724a949fc7d5fc56d8b93"},
- {file = "aiohttp-3.8.1-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:b9d00268fcb9f66fbcc7cd9fe423741d90c75ee029a1d15c09b22d23253c0a44"},
- {file = "aiohttp-3.8.1-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:07b05cd3305e8a73112103c834e91cd27ce5b4bd07850c4b4dbd1877d3f45be7"},
- {file = "aiohttp-3.8.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:c34dc4958b232ef6188c4318cb7b2c2d80521c9a56c52449f8f93ab7bc2a8a1c"},
- {file = "aiohttp-3.8.1-cp37-cp37m-win32.whl", hash = "sha256:d2f9b69293c33aaa53d923032fe227feac867f81682f002ce33ffae978f0a9a9"},
- {file = "aiohttp-3.8.1-cp37-cp37m-win_amd64.whl", hash = "sha256:6ae828d3a003f03ae31915c31fa684b9890ea44c9c989056fea96e3d12a9fa17"},
- {file = "aiohttp-3.8.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:0c7ebbbde809ff4e970824b2b6cb7e4222be6b95a296e46c03cf050878fc1785"},
- {file = "aiohttp-3.8.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8b7ef7cbd4fec9a1e811a5de813311ed4f7ac7d93e0fda233c9b3e1428f7dd7b"},
- {file = "aiohttp-3.8.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:c3d6a4d0619e09dcd61021debf7059955c2004fa29f48788a3dfaf9c9901a7cd"},
- {file = "aiohttp-3.8.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:718626a174e7e467f0558954f94af117b7d4695d48eb980146016afa4b580b2e"},
- {file = "aiohttp-3.8.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:589c72667a5febd36f1315aa6e5f56dd4aa4862df295cb51c769d16142ddd7cd"},
- {file = "aiohttp-3.8.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2ed076098b171573161eb146afcb9129b5ff63308960aeca4b676d9d3c35e700"},
- {file = "aiohttp-3.8.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:086f92daf51a032d062ec5f58af5ca6a44d082c35299c96376a41cbb33034675"},
- {file = "aiohttp-3.8.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:11691cf4dc5b94236ccc609b70fec991234e7ef8d4c02dd0c9668d1e486f5abf"},
- {file = "aiohttp-3.8.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:31d1e1c0dbf19ebccbfd62eff461518dcb1e307b195e93bba60c965a4dcf1ba0"},
- {file = "aiohttp-3.8.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:11a67c0d562e07067c4e86bffc1553f2cf5b664d6111c894671b2b8712f3aba5"},
- {file = "aiohttp-3.8.1-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:bb01ba6b0d3f6c68b89fce7305080145d4877ad3acaed424bae4d4ee75faa950"},
- {file = "aiohttp-3.8.1-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:44db35a9e15d6fe5c40d74952e803b1d96e964f683b5a78c3cc64eb177878155"},
- {file = "aiohttp-3.8.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:844a9b460871ee0a0b0b68a64890dae9c415e513db0f4a7e3cab41a0f2fedf33"},
- {file = "aiohttp-3.8.1-cp38-cp38-win32.whl", hash = "sha256:7d08744e9bae2ca9c382581f7dce1273fe3c9bae94ff572c3626e8da5b193c6a"},
- {file = "aiohttp-3.8.1-cp38-cp38-win_amd64.whl", hash = "sha256:04d48b8ce6ab3cf2097b1855e1505181bdd05586ca275f2505514a6e274e8e75"},
- {file = "aiohttp-3.8.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:f5315a2eb0239185af1bddb1abf472d877fede3cc8d143c6cddad37678293237"},
- {file = "aiohttp-3.8.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a996d01ca39b8dfe77440f3cd600825d05841088fd6bc0144cc6c2ec14cc5f74"},
- {file = "aiohttp-3.8.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:13487abd2f761d4be7c8ff9080de2671e53fff69711d46de703c310c4c9317ca"},
- {file = "aiohttp-3.8.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea302f34477fda3f85560a06d9ebdc7fa41e82420e892fc50b577e35fc6a50b2"},
- {file = "aiohttp-3.8.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a2f635ce61a89c5732537a7896b6319a8fcfa23ba09bec36e1b1ac0ab31270d2"},
- {file = "aiohttp-3.8.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e999f2d0e12eea01caeecb17b653f3713d758f6dcc770417cf29ef08d3931421"},
- {file = "aiohttp-3.8.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0770e2806a30e744b4e21c9d73b7bee18a1cfa3c47991ee2e5a65b887c49d5cf"},
- {file = "aiohttp-3.8.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:d15367ce87c8e9e09b0f989bfd72dc641bcd04ba091c68cd305312d00962addd"},
- {file = "aiohttp-3.8.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:6c7cefb4b0640703eb1069835c02486669312bf2f12b48a748e0a7756d0de33d"},
- {file = "aiohttp-3.8.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:71927042ed6365a09a98a6377501af5c9f0a4d38083652bcd2281a06a5976724"},
- {file = "aiohttp-3.8.1-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:28d490af82bc6b7ce53ff31337a18a10498303fe66f701ab65ef27e143c3b0ef"},
- {file = "aiohttp-3.8.1-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:b6613280ccedf24354406caf785db748bebbddcf31408b20c0b48cb86af76866"},
- {file = "aiohttp-3.8.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:81e3d8c34c623ca4e36c46524a3530e99c0bc95ed068fd6e9b55cb721d408fb2"},
- {file = "aiohttp-3.8.1-cp39-cp39-win32.whl", hash = "sha256:7187a76598bdb895af0adbd2fb7474d7f6025d170bc0a1130242da817ce9e7d1"},
- {file = "aiohttp-3.8.1-cp39-cp39-win_amd64.whl", hash = "sha256:1c182cb873bc91b411e184dab7a2b664d4fea2743df0e4d57402f7f3fa644bac"},
- {file = "aiohttp-3.8.1.tar.gz", hash = "sha256:fc5471e1a54de15ef71c1bc6ebe80d4dc681ea600e68bfd1cbce40427f0b7578"},
-]
-aiosignal = [
- {file = "aiosignal-1.2.0-py3-none-any.whl", hash = "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a"},
- {file = "aiosignal-1.2.0.tar.gz", hash = "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"},
-]
+aiohttp = []
+aiosignal = []
astunparse = []
-async-timeout = [
- {file = "async-timeout-4.0.2.tar.gz", hash = "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15"},
- {file = "async_timeout-4.0.2-py3-none-any.whl", hash = "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"},
-]
+async-timeout = []
asynctest = []
-atomicwrites = [
- {file = "atomicwrites-1.4.0-py2.py3-none-any.whl", hash = "sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197"},
- {file = "atomicwrites-1.4.0.tar.gz", hash = "sha256:ae70396ad1a434f9c7046fd2dd196fc04b12f9e91ffb859164193be8b6168a7a"},
-]
-attrs = [
- {file = "attrs-21.4.0-py2.py3-none-any.whl", hash = "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4"},
- {file = "attrs-21.4.0.tar.gz", hash = "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"},
-]
+atomicwrites = []
+attrs = []
bandit = []
-beautifulsoup4 = [
- {file = "beautifulsoup4-4.11.1-py3-none-any.whl", hash = "sha256:58d5c3d29f5a36ffeb94f02f0d786cd53014cf9b3b3951d42e0080d8a9498d30"},
- {file = "beautifulsoup4-4.11.1.tar.gz", hash = "sha256:ad9aa55b65ef2808eb405f46cf74df7fcb7044d5cbc26487f96eb2ef2e436693"},
-]
+beautifulsoup4 = []
black = []
bleach = []
cached-property = []
cachetools = []
-certifi = [
- {file = "certifi-2022.6.15-py3-none-any.whl", hash = "sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412"},
- {file = "certifi-2022.6.15.tar.gz", hash = "sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d"},
-]
+certifi = []
cffi = []
charset-normalizer = []
-click = [
- {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
- {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
-]
-colorama = [
- {file = "colorama-0.4.5-py2.py3-none-any.whl", hash = "sha256:854bf444933e37f5824ae7bfc1e98d5bce2ebe4160d46b5edf346a89358e99da"},
- {file = "colorama-0.4.5.tar.gz", hash = "sha256:e6c6b4334fc50988a639d9b98aa429a0b57da6e17b9a44f0451f930b6967b7a4"},
-]
+click = []
+colorama = []
coverage = []
-cycler = [
- {file = "cycler-0.11.0-py3-none-any.whl", hash = "sha256:3a27e95f763a428a739d2add979fa7494c912a32c17c4c38c4d5f082cad165a3"},
- {file = "cycler-0.11.0.tar.gz", hash = "sha256:9c87405839a19696e837b3b818fed3f5f69f16f1eec1a1ad77e043dcea9c772f"},
-]
+cycler = []
datasets = []
-defusedxml = [
- {file = "defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61"},
- {file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"},
-]
-dill = [
- {file = "dill-0.3.5.1-py2.py3-none-any.whl", hash = "sha256:33501d03270bbe410c72639b350e941882a8b0fd55357580fbc873fba0c59302"},
- {file = "dill-0.3.5.1.tar.gz", hash = "sha256:d75e41f3eff1eee599d738e76ba8f4ad98ea229db8b085318aa2b3333a208c86"},
-]
+defusedxml = []
+dill = []
docstring-parser = []
docutils = []
-entrypoints = [
- {file = "entrypoints-0.4-py3-none-any.whl", hash = "sha256:f174b5ff827504fd3cd97cc3f8649f3693f51538c7e4bdf3ef002c8429d42f9f"},
- {file = "entrypoints-0.4.tar.gz", hash = "sha256:b706eddaa9218a19ebcd67b56818f05bb27589b1ca9e8d797b74affad4ccacd4"},
-]
-fastjsonschema = [
- {file = "fastjsonschema-2.15.3-py3-none-any.whl", hash = "sha256:ddb0b1d8243e6e3abb822bd14e447a89f4ab7439342912d590444831fa00b6a0"},
- {file = "fastjsonschema-2.15.3.tar.gz", hash = "sha256:0a572f0836962d844c1fc435e200b2e4f4677e4e6611a2e3bdd01ba697c275ec"},
-]
-filelock = [
- {file = "filelock-3.7.1-py3-none-any.whl", hash = "sha256:37def7b658813cda163b56fc564cdc75e86d338246458c4c28ae84cabefa2404"},
- {file = "filelock-3.7.1.tar.gz", hash = "sha256:3a0fd85166ad9dbab54c9aec96737b744106dc5f15c0b09a6744a445299fcf04"},
-]
+entrypoints = []
+fastjsonschema = []
+filelock = []
flake8 = []
-fonttools = [
- {file = "fonttools-4.33.3-py3-none-any.whl", hash = "sha256:f829c579a8678fa939a1d9e9894d01941db869de44390adb49ce67055a06cc2a"},
- {file = "fonttools-4.33.3.zip", hash = "sha256:c0fdcfa8ceebd7c1b2021240bd46ef77aa8e7408cf10434be55df52384865f8e"},
-]
-frozenlist = [
- {file = "frozenlist-1.3.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d2257aaba9660f78c7b1d8fea963b68f3feffb1a9d5d05a18401ca9eb3e8d0a3"},
- {file = "frozenlist-1.3.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4a44ebbf601d7bac77976d429e9bdb5a4614f9f4027777f9e54fd765196e9d3b"},
- {file = "frozenlist-1.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:45334234ec30fc4ea677f43171b18a27505bfb2dba9aca4398a62692c0ea8868"},
- {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47be22dc27ed933d55ee55845d34a3e4e9f6fee93039e7f8ebadb0c2f60d403f"},
- {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03a7dd1bfce30216a3f51a84e6dd0e4a573d23ca50f0346634916ff105ba6e6b"},
- {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:691ddf6dc50480ce49f68441f1d16a4c3325887453837036e0fb94736eae1e58"},
- {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bde99812f237f79eaf3f04ebffd74f6718bbd216101b35ac7955c2d47c17da02"},
- {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a202458d1298ced3768f5a7d44301e7c86defac162ace0ab7434c2e961166e8"},
- {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b9e3e9e365991f8cc5f5edc1fd65b58b41d0514a6a7ad95ef5c7f34eb49b3d3e"},
- {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:04cb491c4b1c051734d41ea2552fde292f5f3a9c911363f74f39c23659c4af78"},
- {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:436496321dad302b8b27ca955364a439ed1f0999311c393dccb243e451ff66aa"},
- {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:754728d65f1acc61e0f4df784456106e35afb7bf39cfe37227ab00436fb38676"},
- {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6eb275c6385dd72594758cbe96c07cdb9bd6becf84235f4a594bdf21e3596c9d"},
- {file = "frozenlist-1.3.0-cp310-cp310-win32.whl", hash = "sha256:e30b2f9683812eb30cf3f0a8e9f79f8d590a7999f731cf39f9105a7c4a39489d"},
- {file = "frozenlist-1.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:f7353ba3367473d1d616ee727945f439e027f0bb16ac1a750219a8344d1d5d3c"},
- {file = "frozenlist-1.3.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:88aafd445a233dbbf8a65a62bc3249a0acd0d81ab18f6feb461cc5a938610d24"},
- {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4406cfabef8f07b3b3af0f50f70938ec06d9f0fc26cbdeaab431cbc3ca3caeaa"},
- {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8cf829bd2e2956066dd4de43fd8ec881d87842a06708c035b37ef632930505a2"},
- {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:603b9091bd70fae7be28bdb8aa5c9990f4241aa33abb673390a7f7329296695f"},
- {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25af28b560e0c76fa41f550eacb389905633e7ac02d6eb3c09017fa1c8cdfde1"},
- {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:94c7a8a9fc9383b52c410a2ec952521906d355d18fccc927fca52ab575ee8b93"},
- {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:65bc6e2fece04e2145ab6e3c47428d1bbc05aede61ae365b2c1bddd94906e478"},
- {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:3f7c935c7b58b0d78c0beea0c7358e165f95f1fd8a7e98baa40d22a05b4a8141"},
- {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd89acd1b8bb4f31b47072615d72e7f53a948d302b7c1d1455e42622de180eae"},
- {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:6983a31698490825171be44ffbafeaa930ddf590d3f051e397143a5045513b01"},
- {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:adac9700675cf99e3615eb6a0eb5e9f5a4143c7d42c05cea2e7f71c27a3d0846"},
- {file = "frozenlist-1.3.0-cp37-cp37m-win32.whl", hash = "sha256:0c36e78b9509e97042ef869c0e1e6ef6429e55817c12d78245eb915e1cca7468"},
- {file = "frozenlist-1.3.0-cp37-cp37m-win_amd64.whl", hash = "sha256:57f4d3f03a18facacb2a6bcd21bccd011e3b75d463dc49f838fd699d074fabd1"},
- {file = "frozenlist-1.3.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:8c905a5186d77111f02144fab5b849ab524f1e876a1e75205cd1386a9be4b00a"},
- {file = "frozenlist-1.3.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b5009062d78a8c6890d50b4e53b0ddda31841b3935c1937e2ed8c1bda1c7fb9d"},
- {file = "frozenlist-1.3.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2fdc3cd845e5a1f71a0c3518528bfdbfe2efaf9886d6f49eacc5ee4fd9a10953"},
- {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:92e650bd09b5dda929523b9f8e7f99b24deac61240ecc1a32aeba487afcd970f"},
- {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:40dff8962b8eba91fd3848d857203f0bd704b5f1fa2b3fc9af64901a190bba08"},
- {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:768efd082074bb203c934e83a61654ed4931ef02412c2fbdecea0cff7ecd0274"},
- {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:006d3595e7d4108a12025ddf415ae0f6c9e736e726a5db0183326fd191b14c5e"},
- {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:871d42623ae15eb0b0e9df65baeee6976b2e161d0ba93155411d58ff27483ad8"},
- {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:aff388be97ef2677ae185e72dc500d19ecaf31b698986800d3fc4f399a5e30a5"},
- {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:9f892d6a94ec5c7b785e548e42722e6f3a52f5f32a8461e82ac3e67a3bd073f1"},
- {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:e982878792c971cbd60ee510c4ee5bf089a8246226dea1f2138aa0bb67aff148"},
- {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:c6c321dd013e8fc20735b92cb4892c115f5cdb82c817b1e5b07f6b95d952b2f0"},
- {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:30530930410855c451bea83f7b272fb1c495ed9d5cc72895ac29e91279401db3"},
- {file = "frozenlist-1.3.0-cp38-cp38-win32.whl", hash = "sha256:40ec383bc194accba825fbb7d0ef3dda5736ceab2375462f1d8672d9f6b68d07"},
- {file = "frozenlist-1.3.0-cp38-cp38-win_amd64.whl", hash = "sha256:f20baa05eaa2bcd5404c445ec51aed1c268d62600362dc6cfe04fae34a424bd9"},
- {file = "frozenlist-1.3.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:0437fe763fb5d4adad1756050cbf855bbb2bf0d9385c7bb13d7a10b0dd550486"},
- {file = "frozenlist-1.3.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b684c68077b84522b5c7eafc1dc735bfa5b341fb011d5552ebe0968e22ed641c"},
- {file = "frozenlist-1.3.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:93641a51f89473837333b2f8100f3f89795295b858cd4c7d4a1f18e299dc0a4f"},
- {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6d32ff213aef0fd0bcf803bffe15cfa2d4fde237d1d4838e62aec242a8362fa"},
- {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31977f84828b5bb856ca1eb07bf7e3a34f33a5cddce981d880240ba06639b94d"},
- {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3c62964192a1c0c30b49f403495911298810bada64e4f03249ca35a33ca0417a"},
- {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4eda49bea3602812518765810af732229b4291d2695ed24a0a20e098c45a707b"},
- {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acb267b09a509c1df5a4ca04140da96016f40d2ed183cdc356d237286c971b51"},
- {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e1e26ac0a253a2907d654a37e390904426d5ae5483150ce3adedb35c8c06614a"},
- {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f96293d6f982c58ebebb428c50163d010c2f05de0cde99fd681bfdc18d4b2dc2"},
- {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:e84cb61b0ac40a0c3e0e8b79c575161c5300d1d89e13c0e02f76193982f066ed"},
- {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:ff9310f05b9d9c5c4dd472983dc956901ee6cb2c3ec1ab116ecdde25f3ce4951"},
- {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d26b650b71fdc88065b7a21f8ace70175bcf3b5bdba5ea22df4bfd893e795a3b"},
- {file = "frozenlist-1.3.0-cp39-cp39-win32.whl", hash = "sha256:01a73627448b1f2145bddb6e6c2259988bb8aee0fb361776ff8604b99616cd08"},
- {file = "frozenlist-1.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:772965f773757a6026dea111a15e6e2678fbd6216180f82a48a40b27de1ee2ab"},
- {file = "frozenlist-1.3.0.tar.gz", hash = "sha256:ce6f2ba0edb7b0c1d8976565298ad2deba6f8064d2bebb6ffce2ca896eb35b0b"},
-]
-fsspec = [
- {file = "fsspec-2022.5.0-py3-none-any.whl", hash = "sha256:2c198c50eb541a80bbd03540b07602c4a957366f3fb416a1f270d34bd4ff0926"},
- {file = "fsspec-2022.5.0.tar.gz", hash = "sha256:7a5459c75c44e760fbe6a3ccb1f37e81e023cde7da8ba20401258d877ec483b4"},
-]
+fonttools = []
+frozenlist = []
+fsspec = []
ghp-import = []
gitdb = []
gitpython = []
@@ -2310,182 +2129,31 @@ google-auth-oauthlib = []
griffe = []
grpcio = []
h5py = []
-huggingface-hub = [
- {file = "huggingface_hub-0.8.1-py3-none-any.whl", hash = "sha256:a11fb8d696a26f927833d46b7633105fd864fd92a2beb1140cbf1b2f703dedb3"},
- {file = "huggingface_hub-0.8.1.tar.gz", hash = "sha256:75c70797da54b849f06c2cbf7ba2217250ee217230b9f65547d5db3c5bd84bb5"},
-]
+huggingface-hub = []
hypothesis = []
-idna = [
- {file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"},
- {file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"},
-]
+idna = []
importlib-metadata = []
importlib-resources = []
-iniconfig = [
- {file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"},
- {file = "iniconfig-1.1.1.tar.gz", hash = "sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"},
-]
-jinja2 = [
- {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
- {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
-]
-joblib = [
- {file = "joblib-1.1.0-py2.py3-none-any.whl", hash = "sha256:f21f109b3c7ff9d95f8387f752d0d9c34a02aa2f7060c2135f465da0e5160ff6"},
- {file = "joblib-1.1.0.tar.gz", hash = "sha256:4158fcecd13733f8be669be0683b96ebdbbd38d23559f54dca7205aea1bf1e35"},
-]
+iniconfig = []
+jinja2 = []
+joblib = []
jsonargparse = []
jsonschema = []
-jupyter-client = [
- {file = "jupyter_client-7.3.4-py3-none-any.whl", hash = "sha256:17d74b0d0a7b24f1c8c527b24fcf4607c56bee542ffe8e3418e50b21e514b621"},
- {file = "jupyter_client-7.3.4.tar.gz", hash = "sha256:aa9a6c32054b290374f95f73bb0cae91455c58dfb84f65c8591912b8f65e6d56"},
-]
-jupyter-core = [
- {file = "jupyter_core-4.10.0-py3-none-any.whl", hash = "sha256:e7f5212177af7ab34179690140f188aa9bf3d322d8155ed972cbded19f55b6f3"},
- {file = "jupyter_core-4.10.0.tar.gz", hash = "sha256:a6de44b16b7b31d7271130c71a6792c4040f077011961138afed5e5e73181aec"},
-]
-jupyterlab-pygments = [
- {file = "jupyterlab_pygments-0.2.2-py2.py3-none-any.whl", hash = "sha256:2405800db07c9f770863bcf8049a529c3dd4d3e28536638bd7c1c01d2748309f"},
- {file = "jupyterlab_pygments-0.2.2.tar.gz", hash = "sha256:7405d7fde60819d905a9fa8ce89e4cd830e318cdad22a0030f7a901da705585d"},
-]
+jupyter-client = []
+jupyter-core = []
+jupyterlab-pygments = []
jupytext = []
-kiwisolver = [
- {file = "kiwisolver-1.4.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:fd2842a0faed9ab9aba0922c951906132d9384be89690570f0ed18cd4f20e658"},
- {file = "kiwisolver-1.4.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:caa59e2cae0e23b1e225447d7a9ddb0f982f42a6a22d497a484dfe62a06f7c0e"},
- {file = "kiwisolver-1.4.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1d2c744aeedce22c122bb42d176b4aa6d063202a05a4abdacb3e413c214b3694"},
- {file = "kiwisolver-1.4.3-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:afe173ac2646c2636305ab820cc0380b22a00a7bca4290452e7166b4f4fa49d0"},
- {file = "kiwisolver-1.4.3-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:40240da438c0ebfe2aa76dd04b844effac6679423df61adbe3437d32f23468d9"},
- {file = "kiwisolver-1.4.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21a3a98f0a21fc602663ca9bce2b12a4114891bdeba2dea1e9ad84db59892fca"},
- {file = "kiwisolver-1.4.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:51078855a16b7a4984ed2067b54e35803d18bca9861cb60c60f6234b50869a56"},
- {file = "kiwisolver-1.4.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c16635f8dddbeb1b827977d0b00d07b644b040aeb9ff8607a9fc0997afa3e567"},
- {file = "kiwisolver-1.4.3-cp310-cp310-win32.whl", hash = "sha256:2d76780d9c65c7529cedd49fa4802d713e60798d8dc3b0d5b12a0a8f38cca51c"},
- {file = "kiwisolver-1.4.3-cp310-cp310-win_amd64.whl", hash = "sha256:3a297d77b3d6979693f5948df02b89431ae3645ec95865e351fb45578031bdae"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:ca3eefb02ef17257fae8b8555c85e7c1efdfd777f671384b0e4ef27409b02720"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d248c46c0aa406695bda2abf99632db991f8b3a6d46018721a2892312a99f069"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cb55258931448d61e2d50187de4ee66fc9d9f34908b524949b8b2b93d0c57136"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:86bcf0009f2012847a688f2f4f9b16203ca4c835979a02549aa0595d9f457cc8"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e7cf940af5fee00a92e281eb157abe8770227a5255207818ea9a34e54a29f5b2"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:dd22085446f3eca990d12a0878eeb5199dc9553b2e71716bfe7bed9915a472ab"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-win32.whl", hash = "sha256:d2578e5149ff49878934debfacf5c743fab49eca5ecdb983d0b218e1e554c498"},
- {file = "kiwisolver-1.4.3-cp37-cp37m-win_amd64.whl", hash = "sha256:5fb73cc8a34baba1dfa546ae83b9c248ef6150c238b06fc53d2773685b67ec67"},
- {file = "kiwisolver-1.4.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:f70f3d028794e31cf9d1a822914efc935aadb2438ec4e8d4871d95eb1ce032d6"},
- {file = "kiwisolver-1.4.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:71af5b43e4fa286a35110fc5bb740fdeae2b36ca79fbcf0a54237485baeee8be"},
- {file = "kiwisolver-1.4.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:26b5a70bdab09e6a2f40babc4f8f992e3771751e144bda1938084c70d3001c09"},
- {file = "kiwisolver-1.4.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1858ad3cb686eccc7c6b7c5eac846a1cfd45aacb5811b2cf575e80b208f5622a"},
- {file = "kiwisolver-1.4.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4dc350cb65fe4e3f737d50f0465fa6ea0dcae0e5722b7edf5d5b0a0e3cd2c3c7"},
- {file = "kiwisolver-1.4.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:007799c7fa934646318fc128b033bb6e6baabe7fbad521bfb2279aac26225cd7"},
- {file = "kiwisolver-1.4.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:46fb56fde006b7ef5f8eaa3698299b0ea47444238b869ff3ced1426aa9fedcb5"},
- {file = "kiwisolver-1.4.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:b9eb88593159a53a5ee0b0159daee531ff7dd9c87fa78f5d807ca059c7eb1b2b"},
- {file = "kiwisolver-1.4.3-cp38-cp38-win32.whl", hash = "sha256:3b1dcbc49923ac3c973184a82832e1f018dec643b1e054867d04a3a22255ec6a"},
- {file = "kiwisolver-1.4.3-cp38-cp38-win_amd64.whl", hash = "sha256:7118ca592d25b2957ff7b662bc0fe4f4c2b5d5b27814b9b1bc9f2fb249a970e7"},
- {file = "kiwisolver-1.4.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:747190fcdadc377263223f8f72b038381b3b549a8a3df5baf4d067da4749b046"},
- {file = "kiwisolver-1.4.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:fd628e63ffdba0112e3ddf1b1e9f3db29dd8262345138e08f4938acbc6d0805a"},
- {file = "kiwisolver-1.4.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:22ccba48abae827a0f952a78a7b1a7ff01866131e5bbe1f826ce9bda406bf051"},
- {file = "kiwisolver-1.4.3-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:af24b21c2283ca69c416a8a42cde9764dc36c63d3389645d28c69b0e93db3cd7"},
- {file = "kiwisolver-1.4.3-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:547111ef7cf13d73546c2de97ce434935626c897bdec96a578ca100b5fcd694b"},
- {file = "kiwisolver-1.4.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84f85adfebd7d3c3db649efdf73659e1677a2cf3fa6e2556a3f373578af14bf7"},
- {file = "kiwisolver-1.4.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ffd7cf165ff71afb202b3f36daafbf298932bee325aac9f58e1c9cd55838bef0"},
- {file = "kiwisolver-1.4.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6b3136eecf7e1b4a4d23e4b19d6c4e7a8e0b42d55f30444e3c529700cdacaa0d"},
- {file = "kiwisolver-1.4.3-cp39-cp39-win32.whl", hash = "sha256:46c6e5018ba31d5ee7582f323d8661498a154dea1117486a571db4c244531f24"},
- {file = "kiwisolver-1.4.3-cp39-cp39-win_amd64.whl", hash = "sha256:8395064d63b26947fa2c9faeea9c3eee35e52148c5339c37987e1d96fbf009b3"},
- {file = "kiwisolver-1.4.3-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:325fa1b15098e44fe4590a6c5c09a212ca10c6ebb5d96f7447d675f6c8340e4e"},
- {file = "kiwisolver-1.4.3-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:654280c5f41831ddcc5a331c0e3ce2e480bbc3d7c93c18ecf6236313aae2d61a"},
- {file = "kiwisolver-1.4.3-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1ae7aa0784aeadfbd693c27993727792fbe1455b84d49970bad5886b42976b18"},
- {file = "kiwisolver-1.4.3-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:130c6c35eded399d3967cf8a542c20b671f5ba85bd6f210f8b939f868360e9eb"},
- {file = "kiwisolver-1.4.3.tar.gz", hash = "sha256:ab8a15c2750ae8d53e31f77a94f846d0a00772240f1c12817411fa2344351f86"},
-]
+kiwisolver = []
lightning-flash = []
markdown = []
markdown-it-py = []
-markupsafe = [
- {file = "MarkupSafe-2.1.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:86b1f75c4e7c2ac2ccdaec2b9022845dbb81880ca318bb7a0a01fbf7813e3812"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f121a1420d4e173a5d96e47e9a0c0dcff965afdf1626d28de1460815f7c4ee7a"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a49907dd8420c5685cfa064a1335b6754b74541bbb3706c259c02ed65b644b3e"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10c1bfff05d95783da83491be968e8fe789263689c02724e0c691933c52994f5"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b7bd98b796e2b6553da7225aeb61f447f80a1ca64f41d83612e6139ca5213aa4"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b09bf97215625a311f669476f44b8b318b075847b49316d3e28c08e41a7a573f"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:694deca8d702d5db21ec83983ce0bb4b26a578e71fbdbd4fdcd387daa90e4d5e"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:efc1913fd2ca4f334418481c7e595c00aad186563bbc1ec76067848c7ca0a933"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-win32.whl", hash = "sha256:4a33dea2b688b3190ee12bd7cfa29d39c9ed176bda40bfa11099a3ce5d3a7ac6"},
- {file = "MarkupSafe-2.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:dda30ba7e87fbbb7eab1ec9f58678558fd9a6b8b853530e176eabd064da81417"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:671cd1187ed5e62818414afe79ed29da836dde67166a9fac6d435873c44fdd02"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3799351e2336dc91ea70b034983ee71cf2f9533cdff7c14c90ea126bfd95d65a"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e72591e9ecd94d7feb70c1cbd7be7b3ebea3f548870aa91e2732960fa4d57a37"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6fbf47b5d3728c6aea2abb0589b5d30459e369baa772e0f37a0320185e87c980"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:d5ee4f386140395a2c818d149221149c54849dfcfcb9f1debfe07a8b8bd63f9a"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:bcb3ed405ed3222f9904899563d6fc492ff75cce56cba05e32eff40e6acbeaa3"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:e1c0b87e09fa55a220f058d1d49d3fb8df88fbfab58558f1198e08c1e1de842a"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-win32.whl", hash = "sha256:8dc1c72a69aa7e082593c4a203dcf94ddb74bb5c8a731e4e1eb68d031e8498ff"},
- {file = "MarkupSafe-2.1.1-cp37-cp37m-win_amd64.whl", hash = "sha256:97a68e6ada378df82bc9f16b800ab77cbf4b2fada0081794318520138c088e4a"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e8c843bbcda3a2f1e3c2ab25913c80a3c5376cd00c6e8c4a86a89a28c8dc5452"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0212a68688482dc52b2d45013df70d169f542b7394fc744c02a57374a4207003"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e576a51ad59e4bfaac456023a78f6b5e6e7651dcd383bcc3e18d06f9b55d6d1"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b9fe39a2ccc108a4accc2676e77da025ce383c108593d65cc909add5c3bd601"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96e37a3dc86e80bf81758c152fe66dbf60ed5eca3d26305edf01892257049925"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6d0072fea50feec76a4c418096652f2c3238eaa014b2f94aeb1d56a66b41403f"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:089cf3dbf0cd6c100f02945abeb18484bd1ee57a079aefd52cffd17fba910b88"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6a074d34ee7a5ce3effbc526b7083ec9731bb3cbf921bbe1d3005d4d2bdb3a63"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-win32.whl", hash = "sha256:421be9fbf0ffe9ffd7a378aafebbf6f4602d564d34be190fc19a193232fd12b1"},
- {file = "MarkupSafe-2.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:fc7b548b17d238737688817ab67deebb30e8073c95749d55538ed473130ec0c7"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:e04e26803c9c3851c931eac40c695602c6295b8d432cbe78609649ad9bd2da8a"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b87db4360013327109564f0e591bd2a3b318547bcef31b468a92ee504d07ae4f"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99a2a507ed3ac881b975a2976d59f38c19386d128e7a9a18b7df6fff1fd4c1d6"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:56442863ed2b06d19c37f94d999035e15ee982988920e12a5b4ba29b62ad1f77"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3ce11ee3f23f79dbd06fb3d63e2f6af7b12db1d46932fe7bd8afa259a5996603"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:33b74d289bd2f5e527beadcaa3f401e0df0a89927c1559c8566c066fa4248ab7"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:43093fb83d8343aac0b1baa75516da6092f58f41200907ef92448ecab8825135"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8e3dcf21f367459434c18e71b2a9532d96547aef8a871872a5bd69a715c15f96"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-win32.whl", hash = "sha256:d4306c36ca495956b6d568d276ac11fdd9c30a36f1b6eb928070dc5360b22e1c"},
- {file = "MarkupSafe-2.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:46d00d6cfecdde84d40e572d63735ef81423ad31184100411e6e3388d405e247"},
- {file = "MarkupSafe-2.1.1.tar.gz", hash = "sha256:7f91197cc9e48f989d12e4e6fbc46495c446636dfc81b9ccf50bb0ec74b91d4b"},
-]
-matplotlib = [
- {file = "matplotlib-3.5.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:03bbb3f5f78836855e127b5dab228d99551ad0642918ccbf3067fcd52ac7ac5e"},
- {file = "matplotlib-3.5.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:49a5938ed6ef9dda560f26ea930a2baae11ea99e1c2080c8714341ecfda72a89"},
- {file = "matplotlib-3.5.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:77157be0fc4469cbfb901270c205e7d8adb3607af23cef8bd11419600647ceed"},
- {file = "matplotlib-3.5.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5844cea45d804174bf0fac219b4ab50774e504bef477fc10f8f730ce2d623441"},
- {file = "matplotlib-3.5.2-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c87973ddec10812bddc6c286b88fdd654a666080fbe846a1f7a3b4ba7b11ab78"},
- {file = "matplotlib-3.5.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a05f2b37222319753a5d43c0a4fd97ed4ff15ab502113e3f2625c26728040cf"},
- {file = "matplotlib-3.5.2-cp310-cp310-win32.whl", hash = "sha256:9776e1a10636ee5f06ca8efe0122c6de57ffe7e8c843e0fb6e001e9d9256ec95"},
- {file = "matplotlib-3.5.2-cp310-cp310-win_amd64.whl", hash = "sha256:b4fedaa5a9aa9ce14001541812849ed1713112651295fdddd640ea6620e6cf98"},
- {file = "matplotlib-3.5.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:ee175a571e692fc8ae8e41ac353c0e07259113f4cb063b0ec769eff9717e84bb"},
- {file = "matplotlib-3.5.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e8bda1088b941ead50caabd682601bece983cadb2283cafff56e8fcddbf7d7f"},
- {file = "matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9480842d5aadb6e754f0b8f4ebeb73065ac8be1855baa93cd082e46e770591e9"},
- {file = "matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6c623b355d605a81c661546af7f24414165a8a2022cddbe7380a31a4170fa2e9"},
- {file = "matplotlib-3.5.2-cp37-cp37m-win32.whl", hash = "sha256:a91426ae910819383d337ba0dc7971c7cefdaa38599868476d94389a329e599b"},
- {file = "matplotlib-3.5.2-cp37-cp37m-win_amd64.whl", hash = "sha256:c4b82c2ae6d305fcbeb0eb9c93df2602ebd2f174f6e8c8a5d92f9445baa0c1d3"},
- {file = "matplotlib-3.5.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ebc27ad11df3c1661f4677a7762e57a8a91dd41b466c3605e90717c9a5f90c82"},
- {file = "matplotlib-3.5.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:5a32ea6e12e80dedaca2d4795d9ed40f97bfa56e6011e14f31502fdd528b9c89"},
- {file = "matplotlib-3.5.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2a0967d4156adbd0d46db06bc1a877f0370bce28d10206a5071f9ecd6dc60b79"},
- {file = "matplotlib-3.5.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2b696699386766ef171a259d72b203a3c75d99d03ec383b97fc2054f52e15cf"},
- {file = "matplotlib-3.5.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:7f409716119fa39b03da3d9602bd9b41142fab7a0568758cd136cd80b1bf36c8"},
- {file = "matplotlib-3.5.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:b8d3f4e71e26307e8c120b72c16671d70c5cd08ae412355c11254aa8254fb87f"},
- {file = "matplotlib-3.5.2-cp38-cp38-win32.whl", hash = "sha256:b6c63cd01cad0ea8704f1fd586e9dc5777ccedcd42f63cbbaa3eae8dd41172a1"},
- {file = "matplotlib-3.5.2-cp38-cp38-win_amd64.whl", hash = "sha256:75c406c527a3aa07638689586343f4b344fcc7ab1f79c396699eb550cd2b91f7"},
- {file = "matplotlib-3.5.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:4a44cdfdb9d1b2f18b1e7d315eb3843abb097869cd1ef89cfce6a488cd1b5182"},
- {file = "matplotlib-3.5.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3d8e129af95b156b41cb3be0d9a7512cc6d73e2b2109f82108f566dbabdbf377"},
- {file = "matplotlib-3.5.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:364e6bca34edc10a96aa3b1d7cd76eb2eea19a4097198c1b19e89bee47ed5781"},
- {file = "matplotlib-3.5.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea75df8e567743207e2b479ba3d8843537be1c146d4b1e3e395319a4e1a77fe9"},
- {file = "matplotlib-3.5.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:44c6436868186564450df8fd2fc20ed9daaef5caad699aa04069e87099f9b5a8"},
- {file = "matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:7d7705022df2c42bb02937a2a824f4ec3cca915700dd80dc23916af47ff05f1a"},
- {file = "matplotlib-3.5.2-cp39-cp39-win32.whl", hash = "sha256:ee0b8e586ac07f83bb2950717e66cb305e2859baf6f00a9c39cc576e0ce9629c"},
- {file = "matplotlib-3.5.2-cp39-cp39-win_amd64.whl", hash = "sha256:c772264631e5ae61f0bd41313bbe48e1b9bcc95b974033e1118c9caa1a84d5c6"},
- {file = "matplotlib-3.5.2-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:751d3815b555dcd6187ad35b21736dc12ce6925fc3fa363bbc6dc0f86f16484f"},
- {file = "matplotlib-3.5.2-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:31fbc2af27ebb820763f077ec7adc79b5a031c2f3f7af446bd7909674cd59460"},
- {file = "matplotlib-3.5.2-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4fa28ca76ac5c2b2d54bc058b3dad8e22ee85d26d1ee1b116a6fd4d2277b6a04"},
- {file = "matplotlib-3.5.2-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:24173c23d1bcbaed5bf47b8785d27933a1ac26a5d772200a0f3e0e38f471b001"},
- {file = "matplotlib-3.5.2.tar.gz", hash = "sha256:48cf850ce14fa18067f2d9e0d646763681948487a8080ec0af2686468b4607a2"},
-]
-mccabe = [
- {file = "mccabe-0.6.1-py2.py3-none-any.whl", hash = "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42"},
- {file = "mccabe-0.6.1.tar.gz", hash = "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"},
-]
+markupsafe = []
+matplotlib = []
+mccabe = []
mdit-py-plugins = []
mdurl = []
mergedeep = []
-mistune = [
- {file = "mistune-0.8.4-py2.py3-none-any.whl", hash = "sha256:88a1051873018da288eee8538d476dffe1262495144b33ecb586c4ab266bb8d4"},
- {file = "mistune-0.8.4.tar.gz", hash = "sha256:59a3429db53c50b5c6bcc8a07f8848cb00d7dc8bdb431a4ab41920d201d4756e"},
-]
+mistune = []
mkdocs = []
mkdocs-autorefs = []
mkdocs-jupyter = []
@@ -2494,672 +2162,83 @@ mkdocs-material-extensions = []
mkdocstrings = []
mkdocstrings-python = []
mkdocstrings-python-legacy = []
-multidict = [
- {file = "multidict-6.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b9e95a740109c6047602f4db4da9949e6c5945cefbad34a1299775ddc9a62e2"},
- {file = "multidict-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac0e27844758d7177989ce406acc6a83c16ed4524ebc363c1f748cba184d89d3"},
- {file = "multidict-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:041b81a5f6b38244b34dc18c7b6aba91f9cdaf854d9a39e5ff0b58e2b5773b9c"},
- {file = "multidict-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5fdda29a3c7e76a064f2477c9aab1ba96fd94e02e386f1e665bca1807fc5386f"},
- {file = "multidict-6.0.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3368bf2398b0e0fcbf46d85795adc4c259299fec50c1416d0f77c0a843a3eed9"},
- {file = "multidict-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f4f052ee022928d34fe1f4d2bc743f32609fb79ed9c49a1710a5ad6b2198db20"},
- {file = "multidict-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:225383a6603c086e6cef0f2f05564acb4f4d5f019a4e3e983f572b8530f70c88"},
- {file = "multidict-6.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:50bd442726e288e884f7be9071016c15a8742eb689a593a0cac49ea093eef0a7"},
- {file = "multidict-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:47e6a7e923e9cada7c139531feac59448f1f47727a79076c0b1ee80274cd8eee"},
- {file = "multidict-6.0.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:0556a1d4ea2d949efe5fd76a09b4a82e3a4a30700553a6725535098d8d9fb672"},
- {file = "multidict-6.0.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:626fe10ac87851f4cffecee161fc6f8f9853f0f6f1035b59337a51d29ff3b4f9"},
- {file = "multidict-6.0.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:8064b7c6f0af936a741ea1efd18690bacfbae4078c0c385d7c3f611d11f0cf87"},
- {file = "multidict-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:2d36e929d7f6a16d4eb11b250719c39560dd70545356365b494249e2186bc389"},
- {file = "multidict-6.0.2-cp310-cp310-win32.whl", hash = "sha256:fcb91630817aa8b9bc4a74023e4198480587269c272c58b3279875ed7235c293"},
- {file = "multidict-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:8cbf0132f3de7cc6c6ce00147cc78e6439ea736cee6bca4f068bcf892b0fd658"},
- {file = "multidict-6.0.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:05f6949d6169878a03e607a21e3b862eaf8e356590e8bdae4227eedadacf6e51"},
- {file = "multidict-6.0.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2c2e459f7050aeb7c1b1276763364884595d47000c1cddb51764c0d8976e608"},
- {file = "multidict-6.0.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d0509e469d48940147e1235d994cd849a8f8195e0bca65f8f5439c56e17872a3"},
- {file = "multidict-6.0.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:514fe2b8d750d6cdb4712346a2c5084a80220821a3e91f3f71eec11cf8d28fd4"},
- {file = "multidict-6.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:19adcfc2a7197cdc3987044e3f415168fc5dc1f720c932eb1ef4f71a2067e08b"},
- {file = "multidict-6.0.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b9d153e7f1f9ba0b23ad1568b3b9e17301e23b042c23870f9ee0522dc5cc79e8"},
- {file = "multidict-6.0.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:aef9cc3d9c7d63d924adac329c33835e0243b5052a6dfcbf7732a921c6e918ba"},
- {file = "multidict-6.0.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:4571f1beddff25f3e925eea34268422622963cd8dc395bb8778eb28418248e43"},
- {file = "multidict-6.0.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:d48b8ee1d4068561ce8033d2c344cf5232cb29ee1a0206a7b828c79cbc5982b8"},
- {file = "multidict-6.0.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:45183c96ddf61bf96d2684d9fbaf6f3564d86b34cb125761f9a0ef9e36c1d55b"},
- {file = "multidict-6.0.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:75bdf08716edde767b09e76829db8c1e5ca9d8bb0a8d4bd94ae1eafe3dac5e15"},
- {file = "multidict-6.0.2-cp37-cp37m-win32.whl", hash = "sha256:a45e1135cb07086833ce969555df39149680e5471c04dfd6a915abd2fc3f6dbc"},
- {file = "multidict-6.0.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6f3cdef8a247d1eafa649085812f8a310e728bdf3900ff6c434eafb2d443b23a"},
- {file = "multidict-6.0.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:0327292e745a880459ef71be14e709aaea2f783f3537588fb4ed09b6c01bca60"},
- {file = "multidict-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:e875b6086e325bab7e680e4316d667fc0e5e174bb5611eb16b3ea121c8951b86"},
- {file = "multidict-6.0.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:feea820722e69451743a3d56ad74948b68bf456984d63c1a92e8347b7b88452d"},
- {file = "multidict-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9cc57c68cb9139c7cd6fc39f211b02198e69fb90ce4bc4a094cf5fe0d20fd8b0"},
- {file = "multidict-6.0.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:497988d6b6ec6ed6f87030ec03280b696ca47dbf0648045e4e1d28b80346560d"},
- {file = "multidict-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:89171b2c769e03a953d5969b2f272efa931426355b6c0cb508022976a17fd376"},
- {file = "multidict-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:684133b1e1fe91eda8fa7447f137c9490a064c6b7f392aa857bba83a28cfb693"},
- {file = "multidict-6.0.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fd9fc9c4849a07f3635ccffa895d57abce554b467d611a5009ba4f39b78a8849"},
- {file = "multidict-6.0.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:e07c8e79d6e6fd37b42f3250dba122053fddb319e84b55dd3a8d6446e1a7ee49"},
- {file = "multidict-6.0.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:4070613ea2227da2bfb2c35a6041e4371b0af6b0be57f424fe2318b42a748516"},
- {file = "multidict-6.0.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:47fbeedbf94bed6547d3aa632075d804867a352d86688c04e606971595460227"},
- {file = "multidict-6.0.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:5774d9218d77befa7b70d836004a768fb9aa4fdb53c97498f4d8d3f67bb9cfa9"},
- {file = "multidict-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2957489cba47c2539a8eb7ab32ff49101439ccf78eab724c828c1a54ff3ff98d"},
- {file = "multidict-6.0.2-cp38-cp38-win32.whl", hash = "sha256:e5b20e9599ba74391ca0cfbd7b328fcc20976823ba19bc573983a25b32e92b57"},
- {file = "multidict-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:8004dca28e15b86d1b1372515f32eb6f814bdf6f00952699bdeb541691091f96"},
- {file = "multidict-6.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:2e4a0785b84fb59e43c18a015ffc575ba93f7d1dbd272b4cdad9f5134b8a006c"},
- {file = "multidict-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6701bf8a5d03a43375909ac91b6980aea74b0f5402fbe9428fc3f6edf5d9677e"},
- {file = "multidict-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a007b1638e148c3cfb6bf0bdc4f82776cef0ac487191d093cdc316905e504071"},
- {file = "multidict-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:07a017cfa00c9890011628eab2503bee5872f27144936a52eaab449be5eaf032"},
- {file = "multidict-6.0.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c207fff63adcdf5a485969131dc70e4b194327666b7e8a87a97fbc4fd80a53b2"},
- {file = "multidict-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:373ba9d1d061c76462d74e7de1c0c8e267e9791ee8cfefcf6b0b2495762c370c"},
- {file = "multidict-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfba7c6d5d7c9099ba21f84662b037a0ffd4a5e6b26ac07d19e423e6fdf965a9"},
- {file = "multidict-6.0.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:19d9bad105dfb34eb539c97b132057a4e709919ec4dd883ece5838bcbf262b80"},
- {file = "multidict-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:de989b195c3d636ba000ee4281cd03bb1234635b124bf4cd89eeee9ca8fcb09d"},
- {file = "multidict-6.0.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7c40b7bbece294ae3a87c1bc2abff0ff9beef41d14188cda94ada7bcea99b0fb"},
- {file = "multidict-6.0.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:d16cce709ebfadc91278a1c005e3c17dd5f71f5098bfae1035149785ea6e9c68"},
- {file = "multidict-6.0.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:a2c34a93e1d2aa35fbf1485e5010337c72c6791407d03aa5f4eed920343dd360"},
- {file = "multidict-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:feba80698173761cddd814fa22e88b0661e98cb810f9f986c54aa34d281e4937"},
- {file = "multidict-6.0.2-cp39-cp39-win32.whl", hash = "sha256:23b616fdc3c74c9fe01d76ce0d1ce872d2d396d8fa8e4899398ad64fb5aa214a"},
- {file = "multidict-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:4bae31803d708f6f15fd98be6a6ac0b6958fcf68fda3c77a048a4f9073704aae"},
- {file = "multidict-6.0.2.tar.gz", hash = "sha256:5ff3bd75f38e4c43f1f470f2df7a4d430b821c4ce22be384e1459cb57d6bb013"},
-]
-multiprocess = [
- {file = "multiprocess-0.70.13-cp27-cp27m-macosx_10_14_x86_64.whl", hash = "sha256:b9a3be43ecee6776a9e7223af96914a0164f306affcf4624b213885172236b77"},
- {file = "multiprocess-0.70.13-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:7e6a689da3490412caa7b3e27c3385d8aaa49135f3a353ace94ca47e4c926d37"},
- {file = "multiprocess-0.70.13-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:17cb4229aa43e6973679d67c66a454cbf8b6b0d038425cba3220ea5a06d61b58"},
- {file = "multiprocess-0.70.13-cp27-cp27m-win32.whl", hash = "sha256:99bb68dd0d5b3d30fe104721bee26e4637667112d5951b51feb81479fd560876"},
- {file = "multiprocess-0.70.13-cp27-cp27m-win_amd64.whl", hash = "sha256:6cdde49defcb933062df382ebc9b5299beebcd157a98b3a65291c1c94a2edc41"},
- {file = "multiprocess-0.70.13-pp27-pypy_73-macosx_10_7_x86_64.whl", hash = "sha256:92003c247436f8699b7692e95346a238446710f078500eb364bc23bb0503dd4f"},
- {file = "multiprocess-0.70.13-pp27-pypy_73-manylinux2010_x86_64.whl", hash = "sha256:3ec1c8015e19182bfa01b5887a9c25805c48df3c71863f48fe83803147cde5d6"},
- {file = "multiprocess-0.70.13-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:b7415f61bddfffdade73396904551be8124a4a363322aa9c72d42e349c5fca39"},
- {file = "multiprocess-0.70.13-pp37-pypy37_pp73-manylinux_2_24_i686.whl", hash = "sha256:5436d1cd9f901f7ddc4f20b6fd0b462c87dcc00d941cc13eeb2401fc5bd00e42"},
- {file = "multiprocess-0.70.13-pp37-pypy37_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:34e9703bd5b9fee5455c93a74e44dbabe55481c214d03be1e65f037be9d0c520"},
- {file = "multiprocess-0.70.13-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:af0a48440aa8f793d8bb100f20102c12f192de5a608638819a998f2cc59e1fcd"},
- {file = "multiprocess-0.70.13-pp38-pypy38_pp73-manylinux_2_24_i686.whl", hash = "sha256:c4a97216e8319039c69a266252cc68a392b96f9e67e3ed02ad88be9e6f2d2969"},
- {file = "multiprocess-0.70.13-pp38-pypy38_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:48315eefe02c35dd7560da3fa8af66d9f4a61b9dc8f7c40801c5f972ab4604b1"},
- {file = "multiprocess-0.70.13-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:5a6dca5f29f0224c855d0d5cad963476175cfc8de112d3eebe85914cb735f130"},
- {file = "multiprocess-0.70.13-pp39-pypy39_pp73-manylinux_2_24_i686.whl", hash = "sha256:5974bdad390ba466cc130288d2ef1048fdafedd01cf4641fc024f6088af70bfe"},
- {file = "multiprocess-0.70.13-pp39-pypy39_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:01c1137d2f18d0cd262d0fdb7294b1fe9fc3e8dc8b126e506085434ae8eb3677"},
- {file = "multiprocess-0.70.13-py310-none-any.whl", hash = "sha256:0f4faf4811019efdb2f91db09240f893ee40cbfcb06978f3b8ed8c248e73babe"},
- {file = "multiprocess-0.70.13-py37-none-any.whl", hash = "sha256:62e556a0c31ec7176e28aa331663ac26c276ee3536b5e9bb5e850681e7a00f11"},
- {file = "multiprocess-0.70.13-py38-none-any.whl", hash = "sha256:7be9e320a41d2d0d0eddacfe693cfb07b4cb9c0d3d10007f4304255c15215778"},
- {file = "multiprocess-0.70.13-py39-none-any.whl", hash = "sha256:00ef48461d43d1e30f8f4b2e1b287ecaaffec325a37053beb5503e0d69e5a3cd"},
- {file = "multiprocess-0.70.13.tar.gz", hash = "sha256:2e096dd618a84d15aa369a9cf6695815e5539f853dc8fa4f4b9153b11b1d0b32"},
-]
+multidict = []
+multiprocess = []
mypy = []
-mypy-extensions = [
- {file = "mypy_extensions-0.4.3-py2.py3-none-any.whl", hash = "sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d"},
- {file = "mypy_extensions-0.4.3.tar.gz", hash = "sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"},
-]
+mypy-extensions = []
nbclient = []
-nbconvert = [
- {file = "nbconvert-6.5.0-py3-none-any.whl", hash = "sha256:c56dd0b8978a1811a5654f74c727ff16ca87dd5a43abd435a1c49b840fcd8360"},
- {file = "nbconvert-6.5.0.tar.gz", hash = "sha256:223e46e27abe8596b8aed54301fadbba433b7ffea8196a68fd7b1ff509eee99d"},
-]
-nbformat = [
- {file = "nbformat-5.4.0-py3-none-any.whl", hash = "sha256:0d6072aaec95dddc39735c144ee8bbc6589c383fb462e4058abc855348152dad"},
- {file = "nbformat-5.4.0.tar.gz", hash = "sha256:44ba5ca6acb80c5d5a500f1e5b83ede8cbe364d5a495c4c8cf60aaf1ba656501"},
-]
-nest-asyncio = [
- {file = "nest_asyncio-1.5.5-py3-none-any.whl", hash = "sha256:b98e3ec1b246135e4642eceffa5a6c23a3ab12c82ff816a92c612d68205813b2"},
- {file = "nest_asyncio-1.5.5.tar.gz", hash = "sha256:e442291cd942698be619823a17a86a5759eabe1f8613084790de189fe9e16d65"},
-]
+nbconvert = []
+nbformat = []
+nest-asyncio = []
numpy = []
oauthlib = []
-packaging = [
- {file = "packaging-21.3-py3-none-any.whl", hash = "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"},
- {file = "packaging-21.3.tar.gz", hash = "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb"},
-]
+packaging = []
pandas = []
-pandocfilters = [
- {file = "pandocfilters-1.5.0-py2.py3-none-any.whl", hash = "sha256:33aae3f25fd1a026079f5d27bdd52496f0e0803b3469282162bafdcbdf6ef14f"},
- {file = "pandocfilters-1.5.0.tar.gz", hash = "sha256:0b679503337d233b4339a817bfc8c50064e2eff681314376a47cb582305a7a38"},
-]
-pathspec = [
- {file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"},
- {file = "pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"},
-]
+pandocfilters = []
+pathspec = []
pbr = []
pillow = []
-platformdirs = [
- {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
- {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
-]
-pluggy = [
- {file = "pluggy-1.0.0-py2.py3-none-any.whl", hash = "sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3"},
- {file = "pluggy-1.0.0.tar.gz", hash = "sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159"},
-]
+platformdirs = []
+pluggy = []
protobuf = []
-py = [
- {file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"},
- {file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"},
-]
-pyarrow = [
- {file = "pyarrow-8.0.0-cp310-cp310-macosx_10_13_universal2.whl", hash = "sha256:d5ef4372559b191cafe7db8932801eee252bfc35e983304e7d60b6954576a071"},
- {file = "pyarrow-8.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:863be6bad6c53797129610930794a3e797cb7d41c0a30e6794a2ac0e42ce41b8"},
- {file = "pyarrow-8.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:69b043a3fce064ebd9fbae6abc30e885680296e5bd5e6f7353e6a87966cf2ad7"},
- {file = "pyarrow-8.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:51e58778fcb8829fca37fbfaea7f208d5ce7ea89ea133dd13d8ce745278ee6f0"},
- {file = "pyarrow-8.0.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:15511ce2f50343f3fd5e9f7c30e4d004da9134e9597e93e9c96c3985928cbe82"},
- {file = "pyarrow-8.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea132067ec712d1b1116a841db1c95861508862b21eddbcafefbce8e4b96b867"},
- {file = "pyarrow-8.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:deb400df8f19a90b662babceb6dd12daddda6bb357c216e558b207c0770c7654"},
- {file = "pyarrow-8.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:3bd201af6e01f475f02be88cf1f6ee9856ab98c11d8bbb6f58347c58cd07be00"},
- {file = "pyarrow-8.0.0-cp37-cp37m-macosx_10_13_x86_64.whl", hash = "sha256:78a6ac39cd793582998dac88ab5c1c1dd1e6503df6672f064f33a21937ec1d8d"},
- {file = "pyarrow-8.0.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:d6f1e1040413651819074ef5b500835c6c42e6c446532a1ddef8bc5054e8dba5"},
- {file = "pyarrow-8.0.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:98c13b2e28a91b0fbf24b483df54a8d7814c074c2623ecef40dce1fa52f6539b"},
- {file = "pyarrow-8.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c9c97c8e288847e091dfbcdf8ce51160e638346f51919a9e74fe038b2e8aee62"},
- {file = "pyarrow-8.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:edad25522ad509e534400d6ab98cf1872d30c31bc5e947712bfd57def7af15bb"},
- {file = "pyarrow-8.0.0-cp37-cp37m-win_amd64.whl", hash = "sha256:ece333706a94c1221ced8b299042f85fd88b5db802d71be70024433ddf3aecab"},
- {file = "pyarrow-8.0.0-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:95c7822eb37663e073da9892f3499fe28e84f3464711a3e555e0c5463fd53a19"},
- {file = "pyarrow-8.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:25a5f7c7f36df520b0b7363ba9f51c3070799d4b05d587c60c0adaba57763479"},
- {file = "pyarrow-8.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ce64bc1da3109ef5ab9e4c60316945a7239c798098a631358e9ab39f6e5529e9"},
- {file = "pyarrow-8.0.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:541e7845ce5f27a861eb5b88ee165d931943347eec17b9ff1e308663531c9647"},
- {file = "pyarrow-8.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8cd86e04a899bef43e25184f4b934584861d787cf7519851a8c031803d45c6d8"},
- {file = "pyarrow-8.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba2b7aa7efb59156b87987a06f5241932914e4d5bbb74a465306b00a6c808849"},
- {file = "pyarrow-8.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:42b7982301a9ccd06e1dd4fabd2e8e5df74b93ce4c6b87b81eb9e2d86dc79871"},
- {file = "pyarrow-8.0.0-cp39-cp39-macosx_10_13_universal2.whl", hash = "sha256:1dd482ccb07c96188947ad94d7536ab696afde23ad172df8e18944ec79f55055"},
- {file = "pyarrow-8.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:81b87b782a1366279411f7b235deab07c8c016e13f9af9f7c7b0ee564fedcc8f"},
- {file = "pyarrow-8.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:03a10daad957970e914920b793f6a49416699e791f4c827927fd4e4d892a5d16"},
- {file = "pyarrow-8.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:65c7f4cc2be195e3db09296d31a654bb6d8786deebcab00f0e2455fd109d7456"},
- {file = "pyarrow-8.0.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3fee786259d986f8c046100ced54d63b0c8c9f7cdb7d1bbe07dc69e0f928141c"},
- {file = "pyarrow-8.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ea2c54e6b5ecd64e8299d2abb40770fe83a718f5ddc3825ddd5cd28e352cce1"},
- {file = "pyarrow-8.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8392b9a1e837230090fe916415ed4c3433b2ddb1a798e3f6438303c70fbabcfc"},
- {file = "pyarrow-8.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:cb06cacc19f3b426681f2f6803cc06ff481e7fe5b3a533b406bc5b2138843d4f"},
- {file = "pyarrow-8.0.0.tar.gz", hash = "sha256:4a18a211ed888f1ac0b0ebcb99e2d9a3e913a481120ee9b1fe33d3fedb945d4e"},
-]
+py = []
+pyarrow = []
pyasn1 = []
pyasn1-modules = []
pycodestyle = []
-pycparser = [
- {file = "pycparser-2.21-py2.py3-none-any.whl", hash = "sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9"},
- {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
-]
+pycparser = []
pydeprecate = []
pyflakes = []
-pygments = [
- {file = "Pygments-2.12.0-py3-none-any.whl", hash = "sha256:dc9c10fb40944260f6ed4c688ece0cd2048414940f1cea51b8b226318411c519"},
- {file = "Pygments-2.12.0.tar.gz", hash = "sha256:5eb116118f9612ff1ee89ac96437bb6b49e8f04d8a13b514ba26f620208e26eb"},
-]
+pygments = []
pymdown-extensions = []
-pyparsing = [
- {file = "pyparsing-3.0.9-py3-none-any.whl", hash = "sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"},
- {file = "pyparsing-3.0.9.tar.gz", hash = "sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb"},
-]
-pyrsistent = [
- {file = "pyrsistent-0.18.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:df46c854f490f81210870e509818b729db4488e1f30f2a1ce1698b2295a878d1"},
- {file = "pyrsistent-0.18.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5d45866ececf4a5fff8742c25722da6d4c9e180daa7b405dc0a2a2790d668c26"},
- {file = "pyrsistent-0.18.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4ed6784ceac462a7d6fcb7e9b663e93b9a6fb373b7f43594f9ff68875788e01e"},
- {file = "pyrsistent-0.18.1-cp310-cp310-win32.whl", hash = "sha256:e4f3149fd5eb9b285d6bfb54d2e5173f6a116fe19172686797c056672689daf6"},
- {file = "pyrsistent-0.18.1-cp310-cp310-win_amd64.whl", hash = "sha256:636ce2dc235046ccd3d8c56a7ad54e99d5c1cd0ef07d9ae847306c91d11b5fec"},
- {file = "pyrsistent-0.18.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e92a52c166426efbe0d1ec1332ee9119b6d32fc1f0bbfd55d5c1088070e7fc1b"},
- {file = "pyrsistent-0.18.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d7a096646eab884bf8bed965bad63ea327e0d0c38989fc83c5ea7b8a87037bfc"},
- {file = "pyrsistent-0.18.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cdfd2c361b8a8e5d9499b9082b501c452ade8bbf42aef97ea04854f4a3f43b22"},
- {file = "pyrsistent-0.18.1-cp37-cp37m-win32.whl", hash = "sha256:7ec335fc998faa4febe75cc5268a9eac0478b3f681602c1f27befaf2a1abe1d8"},
- {file = "pyrsistent-0.18.1-cp37-cp37m-win_amd64.whl", hash = "sha256:6455fc599df93d1f60e1c5c4fe471499f08d190d57eca040c0ea182301321286"},
- {file = "pyrsistent-0.18.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:fd8da6d0124efa2f67d86fa70c851022f87c98e205f0594e1fae044e7119a5a6"},
- {file = "pyrsistent-0.18.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7bfe2388663fd18bd8ce7db2c91c7400bf3e1a9e8bd7d63bf7e77d39051b85ec"},
- {file = "pyrsistent-0.18.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0e3e1fcc45199df76053026a51cc59ab2ea3fc7c094c6627e93b7b44cdae2c8c"},
- {file = "pyrsistent-0.18.1-cp38-cp38-win32.whl", hash = "sha256:b568f35ad53a7b07ed9b1b2bae09eb15cdd671a5ba5d2c66caee40dbf91c68ca"},
- {file = "pyrsistent-0.18.1-cp38-cp38-win_amd64.whl", hash = "sha256:d1b96547410f76078eaf66d282ddca2e4baae8964364abb4f4dcdde855cd123a"},
- {file = "pyrsistent-0.18.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:f87cc2863ef33c709e237d4b5f4502a62a00fab450c9e020892e8e2ede5847f5"},
- {file = "pyrsistent-0.18.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bc66318fb7ee012071b2792024564973ecc80e9522842eb4e17743604b5e045"},
- {file = "pyrsistent-0.18.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:914474c9f1d93080338ace89cb2acee74f4f666fb0424896fcfb8d86058bf17c"},
- {file = "pyrsistent-0.18.1-cp39-cp39-win32.whl", hash = "sha256:1b34eedd6812bf4d33814fca1b66005805d3640ce53140ab8bbb1e2651b0d9bc"},
- {file = "pyrsistent-0.18.1-cp39-cp39-win_amd64.whl", hash = "sha256:e24a828f57e0c337c8d8bb9f6b12f09dfdf0273da25fda9e314f0b684b415a07"},
- {file = "pyrsistent-0.18.1.tar.gz", hash = "sha256:d4d61f8b993a7255ba714df3aca52700f8125289f84f704cf80916517c46eb96"},
-]
+pyparsing = []
+pyrsistent = []
pytest = []
pytest-cov = []
pytest-mock = []
-python-dateutil = [
- {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
- {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
-]
+python-dateutil = []
pytkdocs = []
pytorch-lightning = []
-pytz = [
- {file = "pytz-2022.1-py2.py3-none-any.whl", hash = "sha256:e68985985296d9a66a881eb3193b0906246245294a881e7c8afe623866ac6a5c"},
- {file = "pytz-2022.1.tar.gz", hash = "sha256:1e760e2fe6a8163bc0b3d9a19c4f84342afa0a2affebfaa84b01b978a02ecaa7"},
-]
-pywin32 = [
- {file = "pywin32-304-cp310-cp310-win32.whl", hash = "sha256:3c7bacf5e24298c86314f03fa20e16558a4e4138fc34615d7de4070c23e65af3"},
- {file = "pywin32-304-cp310-cp310-win_amd64.whl", hash = "sha256:4f32145913a2447736dad62495199a8e280a77a0ca662daa2332acf849f0be48"},
- {file = "pywin32-304-cp310-cp310-win_arm64.whl", hash = "sha256:d3ee45adff48e0551d1aa60d2ec066fec006083b791f5c3527c40cd8aefac71f"},
- {file = "pywin32-304-cp311-cp311-win32.whl", hash = "sha256:30c53d6ce44c12a316a06c153ea74152d3b1342610f1b99d40ba2795e5af0269"},
- {file = "pywin32-304-cp311-cp311-win_amd64.whl", hash = "sha256:7ffa0c0fa4ae4077e8b8aa73800540ef8c24530057768c3ac57c609f99a14fd4"},
- {file = "pywin32-304-cp311-cp311-win_arm64.whl", hash = "sha256:cbbe34dad39bdbaa2889a424d28752f1b4971939b14b1bb48cbf0182a3bcfc43"},
- {file = "pywin32-304-cp36-cp36m-win32.whl", hash = "sha256:be253e7b14bc601718f014d2832e4c18a5b023cbe72db826da63df76b77507a1"},
- {file = "pywin32-304-cp36-cp36m-win_amd64.whl", hash = "sha256:de9827c23321dcf43d2f288f09f3b6d772fee11e809015bdae9e69fe13213988"},
- {file = "pywin32-304-cp37-cp37m-win32.whl", hash = "sha256:f64c0377cf01b61bd5e76c25e1480ca8ab3b73f0c4add50538d332afdf8f69c5"},
- {file = "pywin32-304-cp37-cp37m-win_amd64.whl", hash = "sha256:bb2ea2aa81e96eee6a6b79d87e1d1648d3f8b87f9a64499e0b92b30d141e76df"},
- {file = "pywin32-304-cp38-cp38-win32.whl", hash = "sha256:94037b5259701988954931333aafd39cf897e990852115656b014ce72e052e96"},
- {file = "pywin32-304-cp38-cp38-win_amd64.whl", hash = "sha256:ead865a2e179b30fb717831f73cf4373401fc62fbc3455a0889a7ddac848f83e"},
- {file = "pywin32-304-cp39-cp39-win32.whl", hash = "sha256:25746d841201fd9f96b648a248f731c1dec851c9a08b8e33da8b56148e4c65cc"},
- {file = "pywin32-304-cp39-cp39-win_amd64.whl", hash = "sha256:d24a3382f013b21aa24a5cfbfad5a2cd9926610c0affde3e8ab5b3d7dbcf4ac9"},
-]
+pytz = []
+pywin32 = []
pyyaml = []
pyyaml-env-tag = []
-pyzmq = [
- {file = "pyzmq-23.2.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:22ac0243a41798e3eb5d5714b28c2f28e3d10792dffbc8a5fca092f975fdeceb"},
- {file = "pyzmq-23.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f685003d836ad0e5d4f08d1e024ee3ac7816eb2f873b2266306eef858f058133"},
- {file = "pyzmq-23.2.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:d4651de7316ec8560afe430fb042c0782ed8ac54c0be43a515944d7c78fddac8"},
- {file = "pyzmq-23.2.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:bcc6953e47bcfc9028ddf9ab2a321a3c51d7cc969db65edec092019bb837959f"},
- {file = "pyzmq-23.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e08671dc202a1880fa522f921f35ca5925ba30da8bc96228d74a8f0643ead9c"},
- {file = "pyzmq-23.2.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:de727ea906033b30527b4a99498f19aca3f4d1073230a958679a5b726e2784e0"},
- {file = "pyzmq-23.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f5aa9da520e4bb8cee8189f2f541701405e7690745094ded7a37b425d60527ea"},
- {file = "pyzmq-23.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:f3ff6abde52e702397949054cb5b06c1c75b5d6542f6a2ce029e46f71ffbbbf2"},
- {file = "pyzmq-23.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e2e2db5c6ef376e97c912733dfc24406f5949474d03e800d5f07b6aca4d870af"},
- {file = "pyzmq-23.2.0-cp310-cp310-win32.whl", hash = "sha256:e669913cb2179507628419ec4f0e453e48ce6f924de5884d396f18c31836089c"},
- {file = "pyzmq-23.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:a3dc339f7bc185d5fd0fd976242a5baf35de404d467e056484def8a4dd95868b"},
- {file = "pyzmq-23.2.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:30c365e60c39c53f8eea042b37ea28304ffa6558fb7241cf278745095a5757da"},
- {file = "pyzmq-23.2.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c2d8b69a2bf239ae3d987537bf3fbc2b044a405394cf4c258fc684971dd48b2"},
- {file = "pyzmq-23.2.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:602835e5672ca9ca1d78e6c148fb28c4f91b748ebc41fbd2f479d8763d58bc9b"},
- {file = "pyzmq-23.2.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:831da96ba3f36cc892f0afbb4fb89b28b61b387261676e55d55a682addbd29f7"},
- {file = "pyzmq-23.2.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:c8dec8a2f3f0bb462e6439df436cd8c7ec37968e90b4209ac621e7fbc0ed3b00"},
- {file = "pyzmq-23.2.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:814e5aaf0c3be9991a59066eafb2d6e117aed6b413e3e7e9be45d4e55f5e2748"},
- {file = "pyzmq-23.2.0-cp36-cp36m-win32.whl", hash = "sha256:8496a2a5efd055c61ac2c6a18116c768a25c644b6747dcfde43e91620ab3453c"},
- {file = "pyzmq-23.2.0-cp36-cp36m-win_amd64.whl", hash = "sha256:60746a7e8558655420a69441c0a1d47ed225ed3ac355920b96a96d0554ef7e6b"},
- {file = "pyzmq-23.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5cb642e94337b0c76c9c8cb9bfb0f8a78654575847d080d3e1504f312d691fc3"},
- {file = "pyzmq-23.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:444f7d615d5f686d0ef508b9edfa8a286e6d89f449a1ba37b60ef69d869220a3"},
- {file = "pyzmq-23.2.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c9638e0057e3f1a8b7c5ce33c7575349d9183a033a19b5676ad55096ae36820b"},
- {file = "pyzmq-23.2.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:004a431dfa0459123e6f4660d7e3c4ac19217d134ca38bacfffb2e78716fe944"},
- {file = "pyzmq-23.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:5592fb4316f895922b1cacb91b04a0fa09d6f6f19bbab4442b4d0a0825177b93"},
- {file = "pyzmq-23.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:c0a5f987d73fd9b46c3d180891f829afda714ab6bab30a1218724d4a0a63afd8"},
- {file = "pyzmq-23.2.0-cp37-cp37m-win32.whl", hash = "sha256:d11628212fd731b8986f1561d9bb3f8c38d9c15b330c3d8a88963519fbcd553b"},
- {file = "pyzmq-23.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:558f5f636e3e65f261b64925e8b190e8689e334911595394572cc7523879006d"},
- {file = "pyzmq-23.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:61b97f624da42813f74977425a3a6144d604ea21cf065616d36ea3a866d92c1c"},
- {file = "pyzmq-23.2.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:693c96ae4d975eb8efa1639670e9b1fac0c3f98b7845b65c0f369141fb4bb21f"},
- {file = "pyzmq-23.2.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2b054525c9f7e240562185bf21671ca16d56bde92e9bd0f822c07dec7626b704"},
- {file = "pyzmq-23.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:859059caf564f0c9398c9005278055ed3d37af4d73de6b1597821193b04ca09b"},
- {file = "pyzmq-23.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8355744fdbdeac5cfadfa4f38b82029b5f2b8cab7472a33453a217a7f3a9dce2"},
- {file = "pyzmq-23.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:420b9abd1a7330687a095373b8280a20cdee04342fbc8ccb3b56d9ec8efd4e62"},
- {file = "pyzmq-23.2.0-cp38-cp38-win32.whl", hash = "sha256:59928dfebe93cf1e203e3cb0fd5d5dd384da56b99c8305f2e1b0a933751710f6"},
- {file = "pyzmq-23.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:c882f1d4f96fbd807e92c334251d8ebd159a1ef89059ccd386ddea83fdb91bd8"},
- {file = "pyzmq-23.2.0-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:ced12075cdf3c7332ecc1960f77f7439d5ebb8ea20bbd3c34c8299e694f1b0a1"},
- {file = "pyzmq-23.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3a4d87342c2737fbb9eee5c33c792db27b36b04957b4e6b7edd73a5b239a2a13"},
- {file = "pyzmq-23.2.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:99cedf38eaddf263cf7e2a50e405f12c02cedf6d9df00a0d9c5d7b9417b57f76"},
- {file = "pyzmq-23.2.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:d1610260cc672975723fcf7705c69a95f3b88802a594c9867781bedd9b13422c"},
- {file = "pyzmq-23.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c223a13555444707a0a7ebc6f9ee63053147c8c082bd1a31fd1207a03e8b0500"},
- {file = "pyzmq-23.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f5fdb00d65ec44b10cc6b9b6318ef1363b81647a4aa3270ca39565eadb2d1201"},
- {file = "pyzmq-23.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:984b232802eddf9f0be264a4d57a10b3a1fd7319df14ee6fc7b41c6d155a3e6c"},
- {file = "pyzmq-23.2.0-cp39-cp39-win32.whl", hash = "sha256:f146648941cadaaaf01254a75651a23c08159d009d36c5af42a7cc200a5e53ec"},
- {file = "pyzmq-23.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:83005d8928f8a5cebcfb33af3bfb84b1ad65d882b899141a331cc5d07d89f093"},
- {file = "pyzmq-23.2.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:fee86542dc4ee8229e023003e3939b4d58cc2453922cf127778b69505fc9064b"},
- {file = "pyzmq-23.2.0-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5d57542429df6acff02ff022067aa75b677603cee70e3abb9742787545eec966"},
- {file = "pyzmq-23.2.0-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:057b154471e096e2dda147f7b057041acc303bb7ca4aa24c3b88c6cecdd78717"},
- {file = "pyzmq-23.2.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:5d92e7cbeab7f70b08cc0f27255b0bb2500afc30f31075bca0b1cb87735d186c"},
- {file = "pyzmq-23.2.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:eb4a573a8499685d62545e806d8fd143c84ac8b3439f925cd92c8763f0ed9bd7"},
- {file = "pyzmq-23.2.0-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:da338e2728410d74ddeb1479ec67cfba73311607037455a40f92b6f5c62bf11d"},
- {file = "pyzmq-23.2.0-pp38-pypy38_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1b2a21f595f8cc549abd6c8de1fcd34c83441e35fb24b8a59bf161889c62a486"},
- {file = "pyzmq-23.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:8c0f4d6f8c985bab83792be26ff3233940ba42e22237610ac50cbcfc10a5c235"},
- {file = "pyzmq-23.2.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:bbabd1df23bf63ae829e81200034c0e433499275a6ed29ca1a912ea7629426d9"},
- {file = "pyzmq-23.2.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:21552624ce69e69f7924f413b802b1fb554f4c0497f837810e429faa1cd4f163"},
- {file = "pyzmq-23.2.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c616893a577e9d6773a3836732fd7e2a729157a108b8fccd31c87512fa01671a"},
- {file = "pyzmq-23.2.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:ce4f71e17fa849de41a06109030d3f6815fcc33338bf98dd0dde6d456d33c929"},
- {file = "pyzmq-23.2.0.tar.gz", hash = "sha256:a51f12a8719aad9dcfb55d456022f16b90abc8dde7d3ca93ce3120b40e3fa169"},
-]
-regex = [
- {file = "regex-2022.6.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:042d122f9fee3ceb6d7e3067d56557df697d1aad4ff5f64ecce4dc13a90a7c01"},
- {file = "regex-2022.6.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ffef4b30785dc2d1604dfb7cf9fca5dc27cd86d65f7c2a9ec34d6d3ae4565ec2"},
- {file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0afa6a601acf3c0dc6de4e8d7d8bbce4e82f8542df746226cd35d4a6c15e9456"},
- {file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4a11cbe8eb5fb332ae474895b5ead99392a4ea568bd2a258ab8df883e9c2bf92"},
- {file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c1f62ee2ba880e221bc950651a1a4b0176083d70a066c83a50ef0cb9b178e12"},
- {file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5aba3d13c77173e9bfed2c2cea7fc319f11c89a36fcec08755e8fb169cf3b0df"},
- {file = "regex-2022.6.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:249437f7f5b233792234aeeecb14b0aab1566280de42dfc97c26e6f718297d68"},
- {file = "regex-2022.6.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:179410c79fa86ef318d58ace233f95b87b05a1db6dc493fa29404a43f4b215e2"},
- {file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:5e201b1232d81ca1a7a22ab2f08e1eccad4e111579fd7f3bbf60b21ef4a16cea"},
- {file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fdecb225d0f1d50d4b26ac423e0032e76d46a788b83b4e299a520717a47d968c"},
- {file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:be57f9c7b0b423c66c266a26ad143b2c5514997c05dd32ce7ca95c8b209c2288"},
- {file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:ed657a07d8a47ef447224ea00478f1c7095065dfe70a89e7280e5f50a5725131"},
- {file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:24908aefed23dd065b4a668c0b4ca04d56b7f09d8c8e89636cf6c24e64e67a1e"},
- {file = "regex-2022.6.2-cp310-cp310-win32.whl", hash = "sha256:775694cd0bb2c4accf2f1cdd007381b33ec8b59842736fe61bdbad45f2ac7427"},
- {file = "regex-2022.6.2-cp310-cp310-win_amd64.whl", hash = "sha256:809bbbbbcf8258049b031d80932ba71627d2274029386f0452e9950bcfa2c6e8"},
- {file = "regex-2022.6.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:ecd2b5d983eb0adf2049d41f95205bdc3de4e6cc2350e9c80d4409d3a75229de"},
- {file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f4c101746a8dac0401abefa716b357c546e61ea2e3d4a564a9db9eac57ccbce"},
- {file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:166ae7674d0a0e0f8044e7335ba86d0716c9d49465cff1b153f908e0470b8300"},
- {file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c5eac5d8a8ac9ccf00805d02a968a36f5c967db6c7d2b747ab9ed782b3b3a28b"},
- {file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f57823f35b18d82b201c1b27ce4e55f88e79e81d9ca07b50ce625d33823e1439"},
- {file = "regex-2022.6.2-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4d42e3b7b23473729adbf76103e7df75f9167a5a80b1257ca30688352b4bb2dc"},
- {file = "regex-2022.6.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:b2932e728bee0a634fe55ee54d598054a5a9ffe4cd2be21ba2b4b8e5f8064c2c"},
- {file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:17764683ea01c2b8f103d99ae9de2473a74340df13ce306c49a721f0b1f0eb9e"},
- {file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:2ac29b834100d2c171085ceba0d4a1e7046c434ddffc1434dbc7f9d59af1e945"},
- {file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:f43522fb5d676c99282ca4e2d41e8e2388427c0cf703db6b4a66e49b10b699a8"},
- {file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:9faa01818dad9111dbf2af26c6e3c45140ccbd1192c3a0981f196255bf7ec5e6"},
- {file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:17443f99b8f255273731f915fdbfea4d78d809bb9c3aaf67b889039825d06515"},
- {file = "regex-2022.6.2-cp36-cp36m-win32.whl", hash = "sha256:4a5449adef907919d4ce7a1eab2e27d0211d1b255bf0b8f5dd330ad8707e0fc3"},
- {file = "regex-2022.6.2-cp36-cp36m-win_amd64.whl", hash = "sha256:4d206703a96a39763b5b45cf42645776f5553768ea7f3c2c1a39a4f59cafd4ba"},
- {file = "regex-2022.6.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fcd7c432202bcb8b642c3f43d5bcafc5930d82fe5b2bf2c008162df258445c1d"},
- {file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:186c5a4a4c40621f64d771038ede20fca6c61a9faa8178f9e305aaa0c2442a97"},
- {file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:047b2d1323a51190c01b6604f49fe09682a5c85d3c1b2c8b67c1cd68419ce3c4"},
- {file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:30637e7fa4acfed444525b1ab9683f714be617862820578c9fd4e944d4d9ad1f"},
- {file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3adafe6f2c6d86dbf3313866b61180530ca4dcd0c264932dc8fa1ffb10871d58"},
- {file = "regex-2022.6.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:67ae3601edf86e15ebe40885e5bfdd6002d34879070be15cf18fc0d80ea24fed"},
- {file = "regex-2022.6.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:48dddddce0ea7e7c3e92c1e0c5a28c13ca4dc9cf7e996c706d00479652bff76c"},
- {file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:68e5c641645351eb9eb12c465876e76b53717f99e9b92aea7a2dd645a87aa7aa"},
- {file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:8fd5f8ae42f789538bb634bdfd69b9aa357e76fdfd7ad720f32f8994c0d84f1e"},
- {file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:71988a76fcb68cc091e901fddbcac0f9ad9a475da222c47d3cf8db0876cb5344"},
- {file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:4b8838f70be3ce9e706df9d72f88a0aa7d4c1fea61488e06fdf292ccb70ad2be"},
- {file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:663dca677bd3d2e2b5b7d0329e9f24247e6f38f3b740dd9a778a8ef41a76af41"},
- {file = "regex-2022.6.2-cp37-cp37m-win32.whl", hash = "sha256:24963f0b13cc63db336d8da2a533986419890d128c551baacd934c249d51a779"},
- {file = "regex-2022.6.2-cp37-cp37m-win_amd64.whl", hash = "sha256:ceff75127f828dfe7ceb17b94113ec2df4df274c4cd5533bb299cb099a18a8ca"},
- {file = "regex-2022.6.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a6f2698cfa8340dfe4c0597782776b393ba2274fe4c079900c7c74f68752705"},
- {file = "regex-2022.6.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a8a08ace913c4101f0dc0be605c108a3761842efd5f41a3005565ee5d169fb2b"},
- {file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26dbe90b724efef7820c3cf4a0e5be7f130149f3d2762782e4e8ac2aea284a0b"},
- {file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b5f759a1726b995dc896e86f17f9c0582b54eb4ead00ed5ef0b5b22260eaf2d0"},
- {file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1fc26bb3415e7aa7495c000a2c13bf08ce037775db98c1a3fac9ff04478b6930"},
- {file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:52684da32d9003367dc1a1c07e059b9bbaf135ad0764cd47d8ac3dba2df109bc"},
- {file = "regex-2022.6.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c1264eb40a71cf2bff43d6694ab7254438ca19ef330175060262b3c8dd3931a"},
- {file = "regex-2022.6.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:bc635ab319c9b515236bdf327530acda99be995f9d3b9f148ab1f60b2431e970"},
- {file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:27624b490b5d8880f25dac67e1e2ea93dfef5300b98c6755f585799230d6c746"},
- {file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:555f7596fd1f123f8c3a67974c01d6ef80b9769e04d660d6c1a7cc3e6cff7069"},
- {file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:933e72fbe1829cbd59da2bc51ccd73d73162f087f88521a87a8ec9cb0cf10fa8"},
- {file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:cff5c87e941292c97d11dc81bd20679f56a2830f0f0e32f75b8ed6e0eb40f704"},
- {file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:c757f3a27b6345de13ef3ca956aa805d7734ce68023e84d0fc74e1f09ce66f7a"},
- {file = "regex-2022.6.2-cp38-cp38-win32.whl", hash = "sha256:a58d21dd1a2d6b50ed091554ff85e448fce3fe33a4db8b55d0eba2ca957ed626"},
- {file = "regex-2022.6.2-cp38-cp38-win_amd64.whl", hash = "sha256:495a4165172848503303ed05c9d0409428f789acc27050fe2cf0a4549188a7d5"},
- {file = "regex-2022.6.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1ab5cf7d09515548044e69d3a0ec77c63d7b9dfff4afc19653f638b992573126"},
- {file = "regex-2022.6.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c1ea28f0ee6cbe4c0367c939b015d915aa9875f6e061ba1cf0796ca9a3010570"},
- {file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3de1ecf26ce85521bf73897828b6d0687cc6cf271fb6ff32ac63d26b21f5e764"},
- {file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fa7c7044aabdad2329974be2246babcc21d3ede852b3971a90fd8c2056c20360"},
- {file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:53d69d77e9cfe468b000314dd656be85bb9e96de088a64f75fe128dfe1bf30dd"},
- {file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c8d61883a38b1289fba9944a19a361875b5c0170b83cdcc95ea180247c1b7d3"},
- {file = "regex-2022.6.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c5429202bef174a3760690d912e3a80060b323199a61cef6c6c29b30ce09fd17"},
- {file = "regex-2022.6.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:e85b10280cf1e334a7c95629f6cbbfe30b815a4ea5f1e28d31f79eb92c2c3d93"},
- {file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c400dfed4137f32127ea4063447006d7153c974c680bf0fb1b724cce9f8567fc"},
- {file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7f648037c503985aed39f85088acab6f1eb6a0482d7c6c665a5712c9ad9eaefc"},
- {file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:e7b2ff451f6c305b516281ec45425dd423223c8063218c5310d6f72a0a7a517c"},
- {file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:be456b4313a86be41706319c397c09d9fdd2e5cdfde208292a277b867e99e3d1"},
- {file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c3db393b21b53d7e1d3f881b64c29d886cbfdd3df007e31de68b329edbab7d02"},
- {file = "regex-2022.6.2-cp39-cp39-win32.whl", hash = "sha256:d70596f20a03cb5f935d6e4aad9170a490d88fc4633679bf00c652e9def4619e"},
- {file = "regex-2022.6.2-cp39-cp39-win_amd64.whl", hash = "sha256:3b9b6289e03dbe6a6096880d8ac166cb23c38b4896ad235edee789d4e8697152"},
- {file = "regex-2022.6.2.tar.gz", hash = "sha256:f7b43acb2c46fb2cd506965b2d9cf4c5e64c9c612bac26c1187933c7296bf08c"},
-]
+pyzmq = []
+regex = []
requests = []
requests-oauthlib = []
-responses = [
- {file = "responses-0.18.0-py3-none-any.whl", hash = "sha256:15c63ad16de13ee8e7182d99c9334f64fd81f1ee79f90748d527c28f7ca9dd51"},
- {file = "responses-0.18.0.tar.gz", hash = "sha256:380cad4c1c1dc942e5e8a8eaae0b4d4edf708f4f010db8b7bcfafad1fcd254ff"},
-]
+responses = []
rsa = []
scikit-learn = []
scipy = []
setuptools-scm = []
-six = [
- {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
- {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
-]
+six = []
smmap = []
-soupsieve = [
- {file = "soupsieve-2.3.2.post1-py3-none-any.whl", hash = "sha256:3b2503d3c7084a42b1ebd08116e5f81aadfaea95863628c80a3b774a11b7c759"},
- {file = "soupsieve-2.3.2.post1.tar.gz", hash = "sha256:fc53893b3da2c33de295667a0e19f078c14bf86544af307354de5fcf12a3f30d"},
-]
+soupsieve = []
stevedore = []
structlog = []
tensorboard = []
tensorboard-data-server = []
tensorboard-plugin-wit = []
-threadpoolctl = [
- {file = "threadpoolctl-3.1.0-py3-none-any.whl", hash = "sha256:8b99adda265feb6773280df41eece7b2e6561b772d21ffd52e372f999024907b"},
- {file = "threadpoolctl-3.1.0.tar.gz", hash = "sha256:a335baacfaa4400ae1f0d8e3a58d6674d2f8828e3716bb2802c44955ad391380"},
-]
-tinycss2 = [
- {file = "tinycss2-1.1.1-py3-none-any.whl", hash = "sha256:fe794ceaadfe3cf3e686b22155d0da5780dd0e273471a51846d0a02bc204fec8"},
- {file = "tinycss2-1.1.1.tar.gz", hash = "sha256:b2e44dd8883c360c35dd0d1b5aad0b610e5156c2cb3b33434634e539ead9d8bf"},
-]
-tokenizers = [
- {file = "tokenizers-0.12.1-cp310-cp310-macosx_10_11_x86_64.whl", hash = "sha256:d737df0f8f26e093a82bfb106b6cfb510a0e9302d35834568e5b20b73ddc5a9c"},
- {file = "tokenizers-0.12.1-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f1271224acafb27639c432e1ce4e7d38eab40305ba1c546e871d5c8a32f4f195"},
- {file = "tokenizers-0.12.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cdeba37c2fb44e1aec8a72af4cb369655b59ba313181b1b4b8183f08e759c49c"},
- {file = "tokenizers-0.12.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:53b5f4012ce3ffddd5b00827441b80dc7a0f6b41f4fc5248ae6d36e7d3920c6d"},
- {file = "tokenizers-0.12.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5188e13fc09edfe05712ca3ae5a44e7f2b0137927b1ca210d0fad90d3e58315a"},
- {file = "tokenizers-0.12.1-cp310-cp310-win32.whl", hash = "sha256:eff5ff411f18a201eec137b7b32fcb55e0c48b372d370bd24f965f5bad471fa4"},
- {file = "tokenizers-0.12.1-cp310-cp310-win_amd64.whl", hash = "sha256:bdbca79726fe883c696088ea163715b2f902aec638a8e24bcf9790ff8fa45019"},
- {file = "tokenizers-0.12.1-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:28825dade9e52ad464164020758f9d49eb7251c32b6ae146601c506a23c67c0e"},
- {file = "tokenizers-0.12.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:91906d725cb84d8ee71ce05fbb155d39d494849622b4f9349e5176a8eb01c49b"},
- {file = "tokenizers-0.12.1-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:230f51a0a82ca7b90077eaca2415f12ff9bd144607888b9c50c2ee543452322e"},
- {file = "tokenizers-0.12.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d4339c376b695de2ad8ccaebffa75e4dc1d7857be1103d80e7925b34af8cf78"},
- {file = "tokenizers-0.12.1-cp37-cp37m-macosx_10_11_x86_64.whl", hash = "sha256:27d93b712aa2d4346aa506ecd4ec9e94edeebeaf2d484357b482cdeffc02b5f5"},
- {file = "tokenizers-0.12.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:7f4cb68dc538b52240d1986d2034eb0a6373be2ab5f0787d1be3ad1444ce71b7"},
- {file = "tokenizers-0.12.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ae6c04b629ac2cd2f695739988cb70b9bd8d5e7f849f5b14c4510e942bee5770"},
- {file = "tokenizers-0.12.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6a38b2019d4807d42afeff603a119094ee00f63bea2921136524c8814e9003f8"},
- {file = "tokenizers-0.12.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fde8dccb9033fa344ffce3ee1837939a50e7a210a768f1cf2059beeafa755481"},
- {file = "tokenizers-0.12.1-cp37-cp37m-win32.whl", hash = "sha256:38625595b2fd37bfcce64ff9bfb6868c07e9a7b7f205c909d94a615ce9472287"},
- {file = "tokenizers-0.12.1-cp37-cp37m-win_amd64.whl", hash = "sha256:01abe6fbfe55e4131ca0c4c3d1a9d7ef5df424a8d536e998d2a4fc0bc57935f4"},
- {file = "tokenizers-0.12.1-cp38-cp38-macosx_10_11_x86_64.whl", hash = "sha256:7c5c54080a7d5c89c990e0d478e0882dbac88926d43323a3aa236492a3c9455f"},
- {file = "tokenizers-0.12.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:419d113e3bcc4fe20a313afc47af81e62906306b08fe1601e1443d747d46af1f"},
- {file = "tokenizers-0.12.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9779944559cb7ace6a8516e402895f239b0d9d3c833c67dbaec496310e7e206"},
- {file = "tokenizers-0.12.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7d43de14b4469b57490dbaf136a31c266cb676fa22320f01f230af9219ae9034"},
- {file = "tokenizers-0.12.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:258873634406bd1d438c799993a5e44bbc0132ff055985c03c4fe30f702e9a33"},
- {file = "tokenizers-0.12.1-cp38-cp38-win32.whl", hash = "sha256:3f2647cc256d6a53d18b9dcd71d377828e9f8991fbcbd6fcd8ca2ceb174552b0"},
- {file = "tokenizers-0.12.1-cp38-cp38-win_amd64.whl", hash = "sha256:62a723bd4b18bc55121f5c34cd8efd6c651f2d3b81f81dd50e5351fb65b8a617"},
- {file = "tokenizers-0.12.1-cp39-cp39-macosx_10_11_x86_64.whl", hash = "sha256:411ebc89228f30218ffa9d9c49d414864b0df5026a47c24820431821c4360460"},
- {file = "tokenizers-0.12.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:619728df2551bdfe6f96ff177f9ded958e7ed9e2af94c8d5ac2834d1eb06d112"},
- {file = "tokenizers-0.12.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8cea98f3f9577d1541b7bb0f7a3308a911751067e1d83e01485c9d3411bbf087"},
- {file = "tokenizers-0.12.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:664f36f0a0d409c24f2201d495161fec4d8bc93e091fbb78814eb426f29905a3"},
- {file = "tokenizers-0.12.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0bf2380ad59c50222959a9b6f231339200a826fc5cb2be09ff96d8a59f65fc5e"},
- {file = "tokenizers-0.12.1-cp39-cp39-win32.whl", hash = "sha256:6a7a106d04154c2159db6cd7d042af2e2e0e53aee432f872fe6c8be45100436a"},
- {file = "tokenizers-0.12.1-cp39-cp39-win_amd64.whl", hash = "sha256:2158baf80cbc09259bfd6e0e0fc4597b611e7a72ad5443dad63918a90f1dd304"},
- {file = "tokenizers-0.12.1.tar.gz", hash = "sha256:070746f86efa6c873db341e55cf17bb5e7bdd5450330ca8eca542f5c3dab2c66"},
-]
+threadpoolctl = []
+tinycss2 = []
+tokenizers = []
toml = []
tomli = []
-torch = [
- {file = "torch-1.12.0-cp310-cp310-manylinux1_x86_64.whl", hash = "sha256:3322d33a06e440d715bb214334bd41314c94632d9a2f07d22006bf21da3a2be4"},
- {file = "torch-1.12.0-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:2568f011dddeb5990d8698cc375d237f14568ffa8489854e3b94113b4b6b7c8b"},
- {file = "torch-1.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:e3e8348edca3e3cee5a67a2b452b85c57712efe1cc3ffdb87c128b3dde54534e"},
- {file = "torch-1.12.0-cp310-none-macosx_10_9_x86_64.whl", hash = "sha256:349ea3ba0c0e789e0507876c023181f13b35307aebc2e771efd0e045b8e03e84"},
- {file = "torch-1.12.0-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:13c7cca6b2ea3704d775444f02af53c5f072d145247e17b8cd7813ac57869f03"},
- {file = "torch-1.12.0-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:60d06ee2abfa85f10582d205404d52889d69bcbb71f7e211cfc37e3957ac19ca"},
- {file = "torch-1.12.0-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:a1325c9c28823af497cbf443369bddac9ac59f67f1e600f8ab9b754958e55b76"},
- {file = "torch-1.12.0-cp37-cp37m-win_amd64.whl", hash = "sha256:fb47291596677570246d723ee6abbcbac07eeba89d8f83de31e3954f21f44879"},
- {file = "torch-1.12.0-cp37-none-macosx_10_9_x86_64.whl", hash = "sha256:abbdc5483359b9495dc76e3bd7911ccd2ddc57706c117f8316832e31590af871"},
- {file = "torch-1.12.0-cp37-none-macosx_11_0_arm64.whl", hash = "sha256:72207b8733523388c49d43ffcc4416d1d8cd64c40f7826332e714605ace9b1d2"},
- {file = "torch-1.12.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:0986685f2ec8b7c4d3593e8cfe96be85d462943f1a8f54112fc48d4d9fbbe903"},
- {file = "torch-1.12.0-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:0399746f83b4541bcb5b219a18dbe8cade760aba1c660d2748a38c6dc338ebc7"},
- {file = "torch-1.12.0-cp38-cp38-win_amd64.whl", hash = "sha256:7ddb167827170c4e3ff6a27157414a00b9fef93dea175da04caf92a0619b7aee"},
- {file = "torch-1.12.0-cp38-none-macosx_10_9_x86_64.whl", hash = "sha256:2143d5fe192fd908b70b494349de5b1ac02854a8a902bd5f47d13d85b410e430"},
- {file = "torch-1.12.0-cp38-none-macosx_11_0_arm64.whl", hash = "sha256:44a3804e9bb189574f5d02ccc2dc6e32e26a81b3e095463b7067b786048c6072"},
- {file = "torch-1.12.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:844f1db41173b53fe40c44b3e04fcca23a6ce00ac328b7099f2800e611766845"},
- {file = "torch-1.12.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:63341f96840a223f277e498d2737b39da30d9f57c7a1ef88857b920096317739"},
- {file = "torch-1.12.0-cp39-cp39-win_amd64.whl", hash = "sha256:201abf43a99bb4980cc827dd4b38ac28f35e4dddac7832718be3d5479cafd2c1"},
- {file = "torch-1.12.0-cp39-none-macosx_10_9_x86_64.whl", hash = "sha256:c0313438bc36448ffd209f5fb4e5f325b3af158cdf61c8829b8ddaf128c57816"},
- {file = "torch-1.12.0-cp39-none-macosx_11_0_arm64.whl", hash = "sha256:5ed69d5af232c5c3287d44cef998880dadcc9721cd020e9ae02f42e56b79c2e4"},
-]
+torch = []
torch-hypothesis = []
-torchmetrics = [
- {file = "torchmetrics-0.9.2-py3-none-any.whl", hash = "sha256:ced006295c95c4555df0b8dea92960c00e3303de0da878fcf27e394df4757827"},
- {file = "torchmetrics-0.9.2.tar.gz", hash = "sha256:8178c9242e243318093d9b7237738a504535193d2006da6e58b0ed4003e318d2"},
-]
+torchmetrics = []
torchvision = []
-tornado = [
- {file = "tornado-6.1-cp35-cp35m-macosx_10_9_x86_64.whl", hash = "sha256:d371e811d6b156d82aa5f9a4e08b58debf97c302a35714f6f45e35139c332e32"},
- {file = "tornado-6.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:0d321a39c36e5f2c4ff12b4ed58d41390460f798422c4504e09eb5678e09998c"},
- {file = "tornado-6.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:9de9e5188a782be6b1ce866e8a51bc76a0fbaa0e16613823fc38e4fc2556ad05"},
- {file = "tornado-6.1-cp35-cp35m-manylinux2010_i686.whl", hash = "sha256:61b32d06ae8a036a6607805e6720ef00a3c98207038444ba7fd3d169cd998910"},
- {file = "tornado-6.1-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:3e63498f680547ed24d2c71e6497f24bca791aca2fe116dbc2bd0ac7f191691b"},
- {file = "tornado-6.1-cp35-cp35m-manylinux2014_aarch64.whl", hash = "sha256:6c77c9937962577a6a76917845d06af6ab9197702a42e1346d8ae2e76b5e3675"},
- {file = "tornado-6.1-cp35-cp35m-win32.whl", hash = "sha256:6286efab1ed6e74b7028327365cf7346b1d777d63ab30e21a0f4d5b275fc17d5"},
- {file = "tornado-6.1-cp35-cp35m-win_amd64.whl", hash = "sha256:fa2ba70284fa42c2a5ecb35e322e68823288a4251f9ba9cc77be04ae15eada68"},
- {file = "tornado-6.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:0a00ff4561e2929a2c37ce706cb8233b7907e0cdc22eab98888aca5dd3775feb"},
- {file = "tornado-6.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:748290bf9112b581c525e6e6d3820621ff020ed95af6f17fedef416b27ed564c"},
- {file = "tornado-6.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:e385b637ac3acaae8022e7e47dfa7b83d3620e432e3ecb9a3f7f58f150e50921"},
- {file = "tornado-6.1-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:25ad220258349a12ae87ede08a7b04aca51237721f63b1808d39bdb4b2164558"},
- {file = "tornado-6.1-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:65d98939f1a2e74b58839f8c4dab3b6b3c1ce84972ae712be02845e65391ac7c"},
- {file = "tornado-6.1-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:e519d64089b0876c7b467274468709dadf11e41d65f63bba207e04217f47c085"},
- {file = "tornado-6.1-cp36-cp36m-win32.whl", hash = "sha256:b87936fd2c317b6ee08a5741ea06b9d11a6074ef4cc42e031bc6403f82a32575"},
- {file = "tornado-6.1-cp36-cp36m-win_amd64.whl", hash = "sha256:cc0ee35043162abbf717b7df924597ade8e5395e7b66d18270116f8745ceb795"},
- {file = "tornado-6.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:7250a3fa399f08ec9cb3f7b1b987955d17e044f1ade821b32e5f435130250d7f"},
- {file = "tornado-6.1-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:ed3ad863b1b40cd1d4bd21e7498329ccaece75db5a5bf58cd3c9f130843e7102"},
- {file = "tornado-6.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:dcef026f608f678c118779cd6591c8af6e9b4155c44e0d1bc0c87c036fb8c8c4"},
- {file = "tornado-6.1-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:70dec29e8ac485dbf57481baee40781c63e381bebea080991893cd297742b8fd"},
- {file = "tornado-6.1-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:d3f7594930c423fd9f5d1a76bee85a2c36fd8b4b16921cae7e965f22575e9c01"},
- {file = "tornado-6.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:3447475585bae2e77ecb832fc0300c3695516a47d46cefa0528181a34c5b9d3d"},
- {file = "tornado-6.1-cp37-cp37m-win32.whl", hash = "sha256:e7229e60ac41a1202444497ddde70a48d33909e484f96eb0da9baf8dc68541df"},
- {file = "tornado-6.1-cp37-cp37m-win_amd64.whl", hash = "sha256:cb5ec8eead331e3bb4ce8066cf06d2dfef1bfb1b2a73082dfe8a161301b76e37"},
- {file = "tornado-6.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:20241b3cb4f425e971cb0a8e4ffc9b0a861530ae3c52f2b0434e6c1b57e9fd95"},
- {file = "tornado-6.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:c77da1263aa361938476f04c4b6c8916001b90b2c2fdd92d8d535e1af48fba5a"},
- {file = "tornado-6.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:fba85b6cd9c39be262fcd23865652920832b61583de2a2ca907dbd8e8a8c81e5"},
- {file = "tornado-6.1-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:1e8225a1070cd8eec59a996c43229fe8f95689cb16e552d130b9793cb570a288"},
- {file = "tornado-6.1-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:d14d30e7f46a0476efb0deb5b61343b1526f73ebb5ed84f23dc794bdb88f9d9f"},
- {file = "tornado-6.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:8f959b26f2634a091bb42241c3ed8d3cedb506e7c27b8dd5c7b9f745318ddbb6"},
- {file = "tornado-6.1-cp38-cp38-win32.whl", hash = "sha256:34ca2dac9e4d7afb0bed4677512e36a52f09caa6fded70b4e3e1c89dbd92c326"},
- {file = "tornado-6.1-cp38-cp38-win_amd64.whl", hash = "sha256:6196a5c39286cc37c024cd78834fb9345e464525d8991c21e908cc046d1cc02c"},
- {file = "tornado-6.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f0ba29bafd8e7e22920567ce0d232c26d4d47c8b5cf4ed7b562b5db39fa199c5"},
- {file = "tornado-6.1-cp39-cp39-manylinux1_i686.whl", hash = "sha256:33892118b165401f291070100d6d09359ca74addda679b60390b09f8ef325ffe"},
- {file = "tornado-6.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:7da13da6f985aab7f6f28debab00c67ff9cbacd588e8477034c0652ac141feea"},
- {file = "tornado-6.1-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:e0791ac58d91ac58f694d8d2957884df8e4e2f6687cdf367ef7eb7497f79eaa2"},
- {file = "tornado-6.1-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:66324e4e1beede9ac79e60f88de548da58b1f8ab4b2f1354d8375774f997e6c0"},
- {file = "tornado-6.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:a48900ecea1cbb71b8c71c620dee15b62f85f7c14189bdeee54966fbd9a0c5bd"},
- {file = "tornado-6.1-cp39-cp39-win32.whl", hash = "sha256:d3d20ea5782ba63ed13bc2b8c291a053c8d807a8fa927d941bd718468f7b950c"},
- {file = "tornado-6.1-cp39-cp39-win_amd64.whl", hash = "sha256:548430be2740e327b3fe0201abe471f314741efcb0067ec4f2d7dcfb4825f3e4"},
- {file = "tornado-6.1.tar.gz", hash = "sha256:33c6e81d7bd55b468d2e793517c909b139960b6c790a60b7991b9b6b76fb9791"},
-]
-tqdm = [
- {file = "tqdm-4.64.0-py2.py3-none-any.whl", hash = "sha256:74a2cdefe14d11442cedf3ba4e21a3b84ff9a2dbdc6cfae2c34addb2a14a5ea6"},
- {file = "tqdm-4.64.0.tar.gz", hash = "sha256:40be55d30e200777a307a7585aee69e4eabb46b4ec6a4b4a5f2d9f11e7d5408d"},
-]
-traitlets = [
- {file = "traitlets-5.3.0-py3-none-any.whl", hash = "sha256:65fa18961659635933100db8ca120ef6220555286949774b9cfc106f941d1c7a"},
- {file = "traitlets-5.3.0.tar.gz", hash = "sha256:0bb9f1f9f017aa8ec187d8b1b2a7a6626a2a1d877116baba52a129bfa124f8e2"},
-]
-transformers = [
- {file = "transformers-4.20.1-py3-none-any.whl", hash = "sha256:d284eaf60b10fee45b24688423b5f7ba2d194f8c2dadf8df76cd58c1a9d08b52"},
- {file = "transformers-4.20.1.tar.gz", hash = "sha256:65ee4ae9abdeca8fe3a9e351256345e3c4db2a6a68accd5d6a141cfff6192751"},
-]
+tornado = []
+tqdm = []
+traitlets = []
+transformers = []
typed-ast = []
typing-extensions = []
-urllib3 = [
- {file = "urllib3-1.26.9-py2.py3-none-any.whl", hash = "sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14"},
- {file = "urllib3-1.26.9.tar.gz", hash = "sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e"},
-]
+urllib3 = []
watchdog = []
-webencodings = [
- {file = "webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78"},
- {file = "webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923"},
-]
+webencodings = []
werkzeug = []
-xxhash = [
- {file = "xxhash-3.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:219cba13991fd73cf21a5efdafa5056f0ae0b8f79e5e0112967e3058daf73eea"},
- {file = "xxhash-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3fcbb846af15eff100c412ae54f4974ff277c92eacd41f1ec7803a64fd07fa0c"},
- {file = "xxhash-3.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f475fa817ff7955fc118fc1ca29a6e691d329b7ff43f486af36c22dbdcff1db"},
- {file = "xxhash-3.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9200a90f02ff6fd5fb63dea107842da71d8626d99b768fd31be44f3002c60bbe"},
- {file = "xxhash-3.0.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a1403e4f551c9ef7bcef09af55f1adb169f13e4de253db0887928e5129f87af1"},
- {file = "xxhash-3.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa7f6ca53170189a2268c83af0980e6c10aae69e6a5efa7ca989f89fff9f8c02"},
- {file = "xxhash-3.0.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5b63fbeb6d9c93d50ae0dc2b8a8b7f52f2de19e40fe9edc86637bfa5743b8ba2"},
- {file = "xxhash-3.0.0-cp310-cp310-win32.whl", hash = "sha256:31f25efd10b6f1f6d5c34cd231986d8aae9a42e042daa90b783917f170807869"},
- {file = "xxhash-3.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:807e88ed56e0fb347cb57d5bf44851f9878360fed700f2f63e622ef4eede87a5"},
- {file = "xxhash-3.0.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:6d612c55a75d84d25898f6c5ad6a589aa556d1cb9af770b6c574ee62995167f6"},
- {file = "xxhash-3.0.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6f9309fcaf73f93df3101f03a61dc30644adff3e8d0044fff8c0c195dbbe63e2"},
- {file = "xxhash-3.0.0-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a2273fe40720e86346a17f06ef95cd60ee0d66ffce7cf55e390ef7350112b16d"},
- {file = "xxhash-3.0.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fc6f3a334587c83c5ba56c19b254a97542ce1fc05ccfd66fbf568e6117718d65"},
- {file = "xxhash-3.0.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:36cf410da5bfcca51ac3c2c51a3317dcd7af91f70fa61eca57fba39554f06ae3"},
- {file = "xxhash-3.0.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:21752a3e9a2391d91bd51f4aa2fe028ae14ba6a8d37db9ebe00ccac10be5ac4a"},
- {file = "xxhash-3.0.0-cp36-cp36m-win32.whl", hash = "sha256:322068a063ef156455a401ab720f0892f2d2dd1540c1a308e95a7cbf356df51c"},
- {file = "xxhash-3.0.0-cp36-cp36m-win_amd64.whl", hash = "sha256:2984fa9a880587c0bfa46d32717b2d209863ee68727ea0fc17f05fce25efa692"},
- {file = "xxhash-3.0.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6493dd938b360235da81b1c79d8cd048c4f11977e1159b4e744c54f98d3a7bb4"},
- {file = "xxhash-3.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb9eca32f9b4acc7149db2c86f8108167b9929b7da1887d4287a90cfdb3ea53a"},
- {file = "xxhash-3.0.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f4125e70e4e1d79992d81de837a0586aa0241665dbc5ce01b9c89330ed5cbb66"},
- {file = "xxhash-3.0.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:583bea142569485bdb0c5900e804058c16edba1850b74519688c22bc546e6175"},
- {file = "xxhash-3.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f3adf2891acc18abacd15113e9cbbefd30e5f4ecaae32c23e5486fc09c76ea5"},
- {file = "xxhash-3.0.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ed65a2671d380ae05262ce1e4ccc2b63f3c30506d207bf6fae8cd72be0ad65d4"},
- {file = "xxhash-3.0.0-cp37-cp37m-win32.whl", hash = "sha256:c604b3dcac9d37e3fceaa11884927024953260cc4224d9b89400d16e6cf34021"},
- {file = "xxhash-3.0.0-cp37-cp37m-win_amd64.whl", hash = "sha256:1c6fc59e182506496544bc6d426bcf6077066ed1b40cfcd937f707cc06c7ef50"},
- {file = "xxhash-3.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:5628375dbb76d33b93b44854a6c5433e2a78115e03ea2ae1bb74a34ab012a43f"},
- {file = "xxhash-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:687aa4373690f23a3f43cc23d81005304d284ff6c041bff1f967664ab6410f36"},
- {file = "xxhash-3.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9fa2100fb68b163e99370561c9e29ed37b9153fe99443600bea28829150eb0e4"},
- {file = "xxhash-3.0.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:891d7651431a055f76fe2c8f86c593c3dede8ec5b10ca55e8ff5c9fdceb55f0b"},
- {file = "xxhash-3.0.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:197c32d7b62be02957ca31aa69febadf9c5a34ef953053ea16e2c72465bc450f"},
- {file = "xxhash-3.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:91fa4df41bda3cbec4084d9696028780b47128c1f8450d1ad9c3e4b6bf8b1f99"},
- {file = "xxhash-3.0.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4cd38b766fc40e9fe37b80112656d2e5a0cb2f9bc12e01b286353b5ecd2768e8"},
- {file = "xxhash-3.0.0-cp38-cp38-win32.whl", hash = "sha256:4258ef78f5a7d1f9c595846134c7d81a868c74942051453258eb383498662d4d"},
- {file = "xxhash-3.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:b82b1cf4407ad908e04e864473cc3baa8e764c7bbebea959150764cc681a1611"},
- {file = "xxhash-3.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:da4d91e28418469b29eed8635c08af28b588e51cd04288bed1ba1cf60f2d91f6"},
- {file = "xxhash-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:48aab36169b0c00e586cb4eb2814ab8bfed686933126019906f917ff9a78c99e"},
- {file = "xxhash-3.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b0d522570c9ccea6203b3d96ac7f0cfc1d29e613640475d513be432545c48cc"},
- {file = "xxhash-3.0.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d6054434ddb060685e86e7457f52d188b0886834baaa532f9f78b4f2b53cfd9b"},
- {file = "xxhash-3.0.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cbf546ca5f5903ceeb46d9e6abf81f3a64edb95bb7dbe0f75283eec93a7eb2a0"},
- {file = "xxhash-3.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22704f23f23ccbe892cee3e7568c67f07ac25beaa2d1cff183274005d9d39149"},
- {file = "xxhash-3.0.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:83198e223bcc4b2418b5282ac930e444738c2a33859dee4e570b25c8433d83a2"},
- {file = "xxhash-3.0.0-cp39-cp39-win32.whl", hash = "sha256:3bcd4cd9b22293ea1c08822518fbb6d933c2960d66662d468a1945a45cace194"},
- {file = "xxhash-3.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:f5dd4c37da3408d56ae942dc103f4ae3b43510daa4f5accd0a411fc6e914f10a"},
- {file = "xxhash-3.0.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:485f172abc03f78afd4f38dbdbb5665f59c5487126fa4c3181c6582cda4de03b"},
- {file = "xxhash-3.0.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:035248b3d7ab6deb7b247278494d293b9faccfa853078319d25e2926f566b2f8"},
- {file = "xxhash-3.0.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b30ae90c0cfd10ffe852c6b0f263253782eea74a8189d5f2440f6595c1e8047e"},
- {file = "xxhash-3.0.0-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8fd203d8a3c013e679722047ef4f061f690c6cff49380622444bca4c30f3bf23"},
- {file = "xxhash-3.0.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:6d60059aaef12a01c0cc24f1d7aaaab7933ae9f4b7adfd9ebbd37dc7ceac1745"},
- {file = "xxhash-3.0.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:676c97bf7cc298b65eec0368c2cb5611d87a8e876930843311ca728f69292752"},
- {file = "xxhash-3.0.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2245c6e20e96e3f8fdfb61ad6bc5cde6ce8a1c2b93aa4a32a27bba7ab3aeaf12"},
- {file = "xxhash-3.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ae926a52d020085a2d7f69d0e2155cbf819ae409f2e5dbb345dd40a6462de32"},
- {file = "xxhash-3.0.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a2efdcb811be3edc520b78364c11a1e54f5d8e5db895a9ff2bcdd4a7ffa36a5"},
- {file = "xxhash-3.0.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:885b3a851980056707ab99a2c19c35dfe2c2ba5f602066dbfcd8af45ea855760"},
- {file = "xxhash-3.0.0.tar.gz", hash = "sha256:30b2d97aaf11fb122023f6b44ebb97c6955e9e00d7461a96415ca030b5ceb9c7"},
-]
-yarl = [
- {file = "yarl-1.7.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f2a8508f7350512434e41065684076f640ecce176d262a7d54f0da41d99c5a95"},
- {file = "yarl-1.7.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:da6df107b9ccfe52d3a48165e48d72db0eca3e3029b5b8cb4fe6ee3cb870ba8b"},
- {file = "yarl-1.7.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a1d0894f238763717bdcfea74558c94e3bc34aeacd3351d769460c1a586a8b05"},
- {file = "yarl-1.7.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfe4b95b7e00c6635a72e2d00b478e8a28bfb122dc76349a06e20792eb53a523"},
- {file = "yarl-1.7.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c145ab54702334c42237a6c6c4cc08703b6aa9b94e2f227ceb3d477d20c36c63"},
- {file = "yarl-1.7.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1ca56f002eaf7998b5fcf73b2421790da9d2586331805f38acd9997743114e98"},
- {file = "yarl-1.7.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:1d3d5ad8ea96bd6d643d80c7b8d5977b4e2fb1bab6c9da7322616fd26203d125"},
- {file = "yarl-1.7.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:167ab7f64e409e9bdd99333fe8c67b5574a1f0495dcfd905bc7454e766729b9e"},
- {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:95a1873b6c0dd1c437fb3bb4a4aaa699a48c218ac7ca1e74b0bee0ab16c7d60d"},
- {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6152224d0a1eb254f97df3997d79dadd8bb2c1a02ef283dbb34b97d4f8492d23"},
- {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:5bb7d54b8f61ba6eee541fba4b83d22b8a046b4ef4d8eb7f15a7e35db2e1e245"},
- {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:9c1f083e7e71b2dd01f7cd7434a5f88c15213194df38bc29b388ccdf1492b739"},
- {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f44477ae29025d8ea87ec308539f95963ffdc31a82f42ca9deecf2d505242e72"},
- {file = "yarl-1.7.2-cp310-cp310-win32.whl", hash = "sha256:cff3ba513db55cc6a35076f32c4cdc27032bd075c9faef31fec749e64b45d26c"},
- {file = "yarl-1.7.2-cp310-cp310-win_amd64.whl", hash = "sha256:c9c6d927e098c2d360695f2e9d38870b2e92e0919be07dbe339aefa32a090265"},
- {file = "yarl-1.7.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:9b4c77d92d56a4c5027572752aa35082e40c561eec776048330d2907aead891d"},
- {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c01a89a44bb672c38f42b49cdb0ad667b116d731b3f4c896f72302ff77d71656"},
- {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c19324a1c5399b602f3b6e7db9478e5b1adf5cf58901996fc973fe4fccd73eed"},
- {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3abddf0b8e41445426d29f955b24aeecc83fa1072be1be4e0d194134a7d9baee"},
- {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:6a1a9fe17621af43e9b9fcea8bd088ba682c8192d744b386ee3c47b56eaabb2c"},
- {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8b0915ee85150963a9504c10de4e4729ae700af11df0dc5550e6587ed7891e92"},
- {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:29e0656d5497733dcddc21797da5a2ab990c0cb9719f1f969e58a4abac66234d"},
- {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:bf19725fec28452474d9887a128e98dd67eee7b7d52e932e6949c532d820dc3b"},
- {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:d6f3d62e16c10e88d2168ba2d065aa374e3c538998ed04996cd373ff2036d64c"},
- {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:ac10bbac36cd89eac19f4e51c032ba6b412b3892b685076f4acd2de18ca990aa"},
- {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:aa32aaa97d8b2ed4e54dc65d241a0da1c627454950f7d7b1f95b13985afd6c5d"},
- {file = "yarl-1.7.2-cp36-cp36m-win32.whl", hash = "sha256:87f6e082bce21464857ba58b569370e7b547d239ca22248be68ea5d6b51464a1"},
- {file = "yarl-1.7.2-cp36-cp36m-win_amd64.whl", hash = "sha256:ac35ccde589ab6a1870a484ed136d49a26bcd06b6a1c6397b1967ca13ceb3913"},
- {file = "yarl-1.7.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a467a431a0817a292121c13cbe637348b546e6ef47ca14a790aa2fa8cc93df63"},
- {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ab0c3274d0a846840bf6c27d2c60ba771a12e4d7586bf550eefc2df0b56b3b4"},
- {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d260d4dc495c05d6600264a197d9d6f7fc9347f21d2594926202fd08cf89a8ba"},
- {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fc4dd8b01a8112809e6b636b00f487846956402834a7fd59d46d4f4267181c41"},
- {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c1164a2eac148d85bbdd23e07dfcc930f2e633220f3eb3c3e2a25f6148c2819e"},
- {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:67e94028817defe5e705079b10a8438b8cb56e7115fa01640e9c0bb3edf67332"},
- {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:89ccbf58e6a0ab89d487c92a490cb5660d06c3a47ca08872859672f9c511fc52"},
- {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:8cce6f9fa3df25f55521fbb5c7e4a736683148bcc0c75b21863789e5185f9185"},
- {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:211fcd65c58bf250fb994b53bc45a442ddc9f441f6fec53e65de8cba48ded986"},
- {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c10ea1e80a697cf7d80d1ed414b5cb8f1eec07d618f54637067ae3c0334133c4"},
- {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:52690eb521d690ab041c3919666bea13ab9fbff80d615ec16fa81a297131276b"},
- {file = "yarl-1.7.2-cp37-cp37m-win32.whl", hash = "sha256:695ba021a9e04418507fa930d5f0704edbce47076bdcfeeaba1c83683e5649d1"},
- {file = "yarl-1.7.2-cp37-cp37m-win_amd64.whl", hash = "sha256:c17965ff3706beedafd458c452bf15bac693ecd146a60a06a214614dc097a271"},
- {file = "yarl-1.7.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:fce78593346c014d0d986b7ebc80d782b7f5e19843ca798ed62f8e3ba8728576"},
- {file = "yarl-1.7.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c2a1ac41a6aa980db03d098a5531f13985edcb451bcd9d00670b03129922cd0d"},
- {file = "yarl-1.7.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:39d5493c5ecd75c8093fa7700a2fb5c94fe28c839c8e40144b7ab7ccba6938c8"},
- {file = "yarl-1.7.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1eb6480ef366d75b54c68164094a6a560c247370a68c02dddb11f20c4c6d3c9d"},
- {file = "yarl-1.7.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ba63585a89c9885f18331a55d25fe81dc2d82b71311ff8bd378fc8004202ff6"},
- {file = "yarl-1.7.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e39378894ee6ae9f555ae2de332d513a5763276a9265f8e7cbaeb1b1ee74623a"},
- {file = "yarl-1.7.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c0910c6b6c31359d2f6184828888c983d54d09d581a4a23547a35f1d0b9484b1"},
- {file = "yarl-1.7.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6feca8b6bfb9eef6ee057628e71e1734caf520a907b6ec0d62839e8293e945c0"},
- {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:8300401dc88cad23f5b4e4c1226f44a5aa696436a4026e456fe0e5d2f7f486e6"},
- {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:788713c2896f426a4e166b11f4ec538b5736294ebf7d5f654ae445fd44270832"},
- {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:fd547ec596d90c8676e369dd8a581a21227fe9b4ad37d0dc7feb4ccf544c2d59"},
- {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:737e401cd0c493f7e3dd4db72aca11cfe069531c9761b8ea474926936b3c57c8"},
- {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:baf81561f2972fb895e7844882898bda1eef4b07b5b385bcd308d2098f1a767b"},
- {file = "yarl-1.7.2-cp38-cp38-win32.whl", hash = "sha256:ede3b46cdb719c794427dcce9d8beb4abe8b9aa1e97526cc20de9bd6583ad1ef"},
- {file = "yarl-1.7.2-cp38-cp38-win_amd64.whl", hash = "sha256:cc8b7a7254c0fc3187d43d6cb54b5032d2365efd1df0cd1749c0c4df5f0ad45f"},
- {file = "yarl-1.7.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:580c1f15500e137a8c37053e4cbf6058944d4c114701fa59944607505c2fe3a0"},
- {file = "yarl-1.7.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3ec1d9a0d7780416e657f1e405ba35ec1ba453a4f1511eb8b9fbab81cb8b3ce1"},
- {file = "yarl-1.7.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3bf8cfe8856708ede6a73907bf0501f2dc4e104085e070a41f5d88e7faf237f3"},
- {file = "yarl-1.7.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1be4bbb3d27a4e9aa5f3df2ab61e3701ce8fcbd3e9846dbce7c033a7e8136746"},
- {file = "yarl-1.7.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:534b047277a9a19d858cde163aba93f3e1677d5acd92f7d10ace419d478540de"},
- {file = "yarl-1.7.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6ddcd80d79c96eb19c354d9dca95291589c5954099836b7c8d29278a7ec0bda"},
- {file = "yarl-1.7.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:9bfcd43c65fbb339dc7086b5315750efa42a34eefad0256ba114cd8ad3896f4b"},
- {file = "yarl-1.7.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f64394bd7ceef1237cc604b5a89bf748c95982a84bcd3c4bbeb40f685c810794"},
- {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:044daf3012e43d4b3538562da94a88fb12a6490652dbc29fb19adfa02cf72eac"},
- {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:368bcf400247318382cc150aaa632582d0780b28ee6053cd80268c7e72796dec"},
- {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:bab827163113177aee910adb1f48ff7af31ee0289f434f7e22d10baf624a6dfe"},
- {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0cba38120db72123db7c58322fa69e3c0efa933040ffb586c3a87c063ec7cae8"},
- {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:59218fef177296451b23214c91ea3aba7858b4ae3306dde120224cfe0f7a6ee8"},
- {file = "yarl-1.7.2-cp39-cp39-win32.whl", hash = "sha256:1edc172dcca3f11b38a9d5c7505c83c1913c0addc99cd28e993efeaafdfaa18d"},
- {file = "yarl-1.7.2-cp39-cp39-win_amd64.whl", hash = "sha256:797c2c412b04403d2da075fb93c123df35239cd7b4cc4e0cd9e5839b73f52c58"},
- {file = "yarl-1.7.2.tar.gz", hash = "sha256:45399b46d60c253327a460e99856752009fcee5f5d3c80b2f7c0cae1c38d56dd"},
-]
-zipp = [
- {file = "zipp-3.8.0-py3-none-any.whl", hash = "sha256:c4f6e5bbf48e74f7a38e7cc5b0480ff42b0ae5178957d564d18932525d5cf099"},
- {file = "zipp-3.8.0.tar.gz", hash = "sha256:56bf8aadb83c24db6c4b577e13de374ccfb67da2078beba1d037c17980bf43ad"},
-]
+xxhash = []
+yarl = []
+zipp = []
diff --git a/pyproject.toml b/pyproject.toml
index 6a753348..b39f4791 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -49,7 +49,7 @@ docutils = "0.16"
# Documentation
mkdocs-jupyter = "^0.21.0"
-mkdocs-material = "^8.3.6"
+mkdocs-material = "^8.3.9"
Pygments = "^2.12.0"
mkdocstrings = {extras = ["python"], version = "^0.18.1"}
From dd8f4b47340527d45ef99c28d942175c8ada4b17 Mon Sep 17 00:00:00 2001
From: Dref360
Date: Sun, 21 Aug 2022 12:32:35 -0400
Subject: [PATCH 06/10] Fix search by excluding active_learning_process
---
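Notes: the notebook rework keeps the same three-step loop (train on the labelled
set, estimate uncertainty on the pool, label the most uncertain examples). A
minimal sketch of one such step, assuming the notebook's `model` (a
`ModelWrapper`), `active_set`, `hyperparams` and `use_cuda` are in scope; the
helper name `active_learning_step` is only for illustration, and keyword names
follow the baal version pinned in poetry.lock:

    from baal.active.heuristics import BALD

    def active_learning_step(model, active_set, hyperparams, use_cuda):
        # Step 1 (training) happens before this call via model.train_on_dataset(...).
        # Step 2: estimate uncertainty on the unlabelled pool with MC-Dropout.
        predictions = model.predict_on_dataset(
            active_set.pool,
            batch_size=hyperparams.batch_size,
            iterations=hyperparams.iterations,
            use_cuda=use_cuda,
        )
        # Step 3: label the most uncertain examples (indices are relative to the pool).
        ranked = BALD()(predictions)
        active_set.label(ranked[:hyperparams.query_size])
        return len(active_set)
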
mkdocs.yml | 5 +
notebooks/active_learning_process.ipynb | 326 +++++++++++-------------
poetry.lock | 94 ++++---
pyproject.toml | 1 +
4 files changed, 211 insertions(+), 215 deletions(-)
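
The standalone `TransformAdapter` cell is folded into `get_datasets` in this
patch; for reference, the pattern on its own, as a sketch assuming a
torchvision-style dataset that exposes a `transform` attribute:

    import torch.utils.data as torchdata


    class TransformAdapter(torchdata.Subset):
        # Forward `transform` to the wrapped dataset so that `pool_specifics`
        # can swap augmentations on the pool.
        @property
        def transform(self):
            if hasattr(self.dataset, 'transform'):
                return self.dataset.transform
            raise AttributeError()

        @transform.setter
        def transform(self, transform):
            if hasattr(self.dataset, 'transform'):
                self.dataset.transform = transform


    # Usage mirrors the notebook (ActiveLearningDataset comes from baal.active):
    # train_ds = TransformAdapter(cifar_train, train_mask)
    # active_set = ActiveLearningDataset(train_ds, pool_specifics={'transform': test_transform})
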
diff --git a/mkdocs.yml b/mkdocs.yml
index 2febadb1..c97133d3 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -27,9 +27,14 @@ theme:
icon:
repo: fontawesome/brands/github
plugins:
+ - search
+ - exclude-search:
+ exclude:
+ - notebooks/active_learning_process.md
- mkdocs-jupyter
- mkdocstrings
+
markdown_extensions:
- md_in_html
- attr_list
diff --git a/notebooks/active_learning_process.ipynb b/notebooks/active_learning_process.ipynb
index 0bd05c6c..00b57ee3 100644
--- a/notebooks/active_learning_process.ipynb
+++ b/notebooks/active_learning_process.ipynb
@@ -2,11 +2,6 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {
- "pycharm": {
- "name": "#%% md\n"
- }
- },
"source": [
"# How to do research and visualize progress\n",
"\n",
@@ -24,14 +19,6 @@
"\n",
"Today we will focus on a simple example with CIFAR10 and we will animate the progress of active learning!\n",
"\n",
- "#### Requirements\n",
- "\n",
- "In addition to BaaL standard requirements, you will need to install:\n",
- "\n",
- "* MulticoreTSNE\n",
- "* Matplotlib\n",
- "\n",
- "\n",
"#### Additional resources\n",
"\n",
"* More info on the inner working of Active Learning Dataset [here](./fundamentals/active-learning.ipynb).\n",
@@ -39,16 +26,17 @@
" [Literature review](https://baal.readthedocs.io/en/latest/literature/core-papers.html).\n",
"\n",
"### Let's do this!"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 1,
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%%\n"
+ "name": "#%% md\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
"outputs": [],
"source": [
"# Let's start with a bunch of imports.\n",
@@ -60,6 +48,7 @@
"import numpy as np\n",
"import torch\n",
"import torch.backends\n",
+ "import torch.utils.data as torchdata\n",
"from torch import optim\n",
"from torch.hub import load_state_dict_from_url\n",
"from torch.nn import CrossEntropyLoss\n",
@@ -73,21 +62,23 @@
"from baal.bayesian.dropout import patch_module\n",
"from baal.modelwrapper import ModelWrapper\n",
"\n",
+ "\n",
"def vgg16(num_classes):\n",
" model = models.vgg16(pretrained=False, num_classes=num_classes)\n",
" weights = load_state_dict_from_url('https://download.pytorch.org/models/vgg16-397923af.pth')\n",
" weights = {k: v for k, v in weights.items() if 'classifier.6' not in k}\n",
" model.load_state_dict(weights, strict=False)\n",
" return model"
- ]
- },
- {
- "cell_type": "markdown",
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%% md\n"
+ "name": "#%%\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "markdown",
"source": [
"### Dataset management and the pool\n",
"\n",
@@ -120,68 +111,37 @@
"`ActiveLearningDataset(your_dataset, pool_specifics:{'transform': test_transform}`\n",
"\n",
"where `test_transform` is the test version of `transform` without data augmentation.\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 2,
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%%\n"
+ "name": "#%% md\n"
}
- },
- "outputs": [],
- "source": [
- "\"\"\"\n",
- "We will make an adapter so that `pool_specifics` modifies the transform correctly.\n",
- "Because the training set is now a torchdata.Subset, modifying the `transform` attribute is harder.\n",
- "\"\"\"\n",
- "\n",
- "import torch.utils.data as torchdata\n",
- "\n",
- "\n",
- "class TransformAdapter(torchdata.Subset):\n",
- "\n",
- " @property\n",
- " def transform(self):\n",
- " if hasattr(self.dataset, 'transform'):\n",
- " return self.dataset.transform\n",
- " else:\n",
- " raise AttributeError()\n",
- "\n",
- " @transform.setter\n",
- " def transform(self, transform):\n",
- " if hasattr(self.dataset, 'transform'):\n",
- " self.dataset.transform = transform"
- ]
+ }
},
{
"cell_type": "markdown",
+ "source": [
+ "Here we define our Experiment configuration, this can come from your favorite experiment manager like MLFlow.\n",
+ "BaaL does not expect a particular format as all arguments are supplied."
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
"name": "#%% md\n"
}
- },
- "source": [
- "Here we define our Experiment configuration, this can come from your favorite experiment manager like MLFlow.\n",
- "BaaL does not expect a particular format as all arguments are supplied."
- ]
+ }
},
{
"cell_type": "code",
- "execution_count": 3,
- "metadata": {
- "pycharm": {
- "name": "#%%\n"
- }
- },
+ "execution_count": null,
"outputs": [],
"source": [
"\n",
"\n",
"@dataclass\n",
"class ExperimentConfig:\n",
- " epoch: int = 20000//100\n",
+ " epoch: int = 20000 // 100\n",
" batch_size: int = 32\n",
" initial_pool: int = 512\n",
" query_size: int = 100\n",
@@ -189,31 +149,33 @@
" heuristic: str = 'bald'\n",
" iterations: int = 40\n",
" training_duration: int = 10\n",
- " \n"
- ]
- },
- {
- "cell_type": "markdown",
+ "\n"
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%% md\n"
+ "name": "#%%\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "markdown",
"source": [
"### Problem definition\n",
"\n",
"We will perform active learning on a toy dataset, CIFAR-3 where we only keep dogs, cats and airplanes. This will make\n",
"visualization easier."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 4,
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%%\n"
+ "name": "#%% md\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
"outputs": [],
"source": [
"def get_datasets(initial_pool):\n",
@@ -228,6 +190,22 @@
" Returns:\n",
" ActiveLearningDataset, Dataset, the training and test set.\n",
" \"\"\"\n",
+ "\n",
+ " class TransformAdapter(torchdata.Subset):\n",
+ " # We need a custom Subset class as we need to override \"transforms\" as well.\n",
+ " # This shouldn't be needed for your experiments.\n",
+ " @property\n",
+ " def transform(self):\n",
+ " if hasattr(self.dataset, 'transform'):\n",
+ " return self.dataset.transform\n",
+ " else:\n",
+ " raise AttributeError()\n",
+ "\n",
+ " @transform.setter\n",
+ " def transform(self, transform):\n",
+ " if hasattr(self.dataset, 'transform'):\n",
+ " self.dataset.transform = transform\n",
+ "\n",
" # airplane, cat, dog\n",
" classes_to_keep = [0, 3, 5]\n",
" transform = transforms.Compose(\n",
@@ -245,31 +223,32 @@
" )\n",
" train_ds = datasets.CIFAR10('.', train=True,\n",
" transform=transform, target_transform=None, download=True)\n",
- " \n",
+ "\n",
" train_mask = np.where([y in classes_to_keep for y in train_ds.targets])[0]\n",
" train_ds = TransformAdapter(train_ds, train_mask)\n",
- " \n",
+ "\n",
" # In a real application, you will want a validation set here.\n",
" test_set = datasets.CIFAR10('.', train=False,\n",
" transform=test_transform, target_transform=None, download=True)\n",
" test_mask = np.where([y in classes_to_keep for y in test_set.targets])[0]\n",
" test_set = TransformAdapter(test_set, test_mask)\n",
- " \n",
+ "\n",
" # Here we set `pool_specifics`, where we set the transform attribute for the pool.\n",
" active_set = ActiveLearningDataset(train_ds, pool_specifics={'transform': test_transform})\n",
"\n",
" # We start labeling randomly.\n",
" active_set.label_randomly(initial_pool)\n",
" return active_set, test_set"
- ]
- },
- {
- "cell_type": "markdown",
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%% md\n"
+ "name": "#%%\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "markdown",
"source": [
"## Creating our experiment\n",
"\n",
@@ -286,26 +265,18 @@
" * Training/testing loops\n",
"* ActiveLearningLoop\n",
" * Will make prediction on the pool and label the most uncertain examples."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 5,
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%%\n"
- }
- },
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "Files already downloaded and verified\n",
- "Files already downloaded and verified\n"
- ]
+ "name": "#%% md\n"
}
- ],
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "outputs": [],
"source": [
"hyperparams = ExperimentConfig()\n",
"use_cuda = torch.cuda.is_available()\n",
@@ -346,15 +317,16 @@
"\n",
"# We will reset the weights at each active learning step so we make a copy.\n",
"init_weights = deepcopy(model.state_dict())"
- ]
- },
- {
- "cell_type": "markdown",
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%% md\n"
+ "name": "#%%\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "markdown",
"source": [
"### What is an active learning loop\n",
"\n",
@@ -363,27 +335,27 @@
"1. Training\n",
"2. Estimate uncertainty on the pool\n",
"3. Label the most uncertain examples.\n"
- ]
+ ],
+ "metadata": {
+ "collapsed": false,
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ }
},
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "pycharm": {
- "name": "#%%\n",
- "is_executing": true
- }
- },
"outputs": [],
"source": [
"labelling_progress = active_set._labelled.copy().astype(np.uint16)\n",
"for epoch in tqdm(range(hyperparams.epoch)):\n",
" # Load the initial weights.\n",
" model.load_state_dict(init_weights)\n",
- " \n",
+ "\n",
" # Train the model on the currently labelled dataset.\n",
" _ = model.train_on_dataset(active_set, optimizer=optimizer, batch_size=hyperparams.batch_size,\n",
- " use_cuda=use_cuda, epoch=hyperparams.training_duration)\n",
+ " use_cuda=use_cuda, epoch=hyperparams.training_duration)\n",
"\n",
" # Get test NLL!\n",
" model.test_on_dataset(test_set, hyperparams.batch_size, use_cuda,\n",
@@ -395,7 +367,7 @@
" # Keep track of progress\n",
" labelling_progress += active_set._labelled.astype(np.uint16)\n",
" if not should_continue:\n",
- " break\n",
+ " break\n",
"\n",
" test_loss = metrics['test_loss'].value\n",
" logs = {\n",
@@ -403,68 +375,64 @@
" \"epoch\": epoch,\n",
" \"Next Training set size\": len(active_set)\n",
" }"
- ]
+ ],
+ "metadata": {
+ "collapsed": false,
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ }
},
{
"cell_type": "markdown",
+ "source": [
+ "We will now save our progress on disk."
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
"name": "#%% md\n"
}
- },
- "source": [
- "We will now save our progress on disk."
- ]
+ }
},
{
"cell_type": "code",
- "execution_count": 7,
- "metadata": {
- "pycharm": {
- "name": "#%%\n"
- }
- },
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "odict_keys(['features.0.weight', 'features.0.bias', 'features.2.weight', 'features.2.bias', 'features.5.weight', 'features.5.bias', 'features.7.weight', 'features.7.bias', 'features.10.weight', 'features.10.bias', 'features.12.weight', 'features.12.bias', 'features.14.weight', 'features.14.bias', 'features.17.weight', 'features.17.bias', 'features.19.weight', 'features.19.bias', 'features.21.weight', 'features.21.bias', 'features.24.weight', 'features.24.bias', 'features.26.weight', 'features.26.bias', 'features.28.weight', 'features.28.bias', 'classifier.0.weight', 'classifier.0.bias', 'classifier.3.weight', 'classifier.3.bias', 'classifier.6.weight', 'classifier.6.bias']) dict_keys(['labelled', 'random_state']) [103 89 135 ... 121 15 77]\n"
- ]
- }
- ],
+ "execution_count": null,
+ "outputs": [],
"source": [
"model_weight = model.state_dict()\n",
"dataset = active_set.state_dict()\n",
- "torch.save({'model':model_weight, 'dataset':dataset, 'labelling_progress':labelling_progress},\n",
+ "torch.save({'model': model_weight, 'dataset': dataset, 'labelling_progress': labelling_progress},\n",
" 'checkpoint.pth')\n",
"print(model.state_dict().keys(), dataset.keys(), labelling_progress)"
- ]
- },
- {
- "cell_type": "markdown",
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%% md\n"
+ "name": "#%%\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "markdown",
"source": [
"## Visualization\n",
"\n",
"Now that our active learning experiment is completed, we can visualize it!\n",
"\n",
"## Get t-SNE features.\n",
- "We will use MultiCoreTSNE to get a t-SNE representation of our dataset. This will allows us to visualize the progress."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 8,
+ "We will use scikit-learn to get a t-SNE representation of our dataset. This will allows us to visualize the progress."
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
- "name": "#%%\n"
+ "name": "#%% md\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
"outputs": [],
"source": [
"# modify our model to get features\n",
@@ -477,34 +445,42 @@
" def __init__(self, model):\n",
" super().__init__()\n",
" self.model = model\n",
+ "\n",
" def forward(self, x):\n",
- " return torch.flatten(self.model.features(x),1)\n",
- " \n",
+ " return torch.flatten(self.model.features(x), 1)\n",
+ "\n",
"\n",
"features = FeatureExtractor(model.model)\n",
"acc = []\n",
- "for x,y in DataLoader(active_set._dataset, batch_size=10):\n",
+ "for x, y in DataLoader(active_set._dataset, batch_size=10):\n",
" acc.append((features(x.cuda()).detach().cpu().numpy(), y.detach().cpu().numpy()))\n",
- " \n",
+ "\n",
"xs, ys = zip(*acc)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 9,
+ ],
"metadata": {
+ "collapsed": false,
"pycharm": {
"name": "#%%\n"
}
- },
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
"outputs": [],
"source": [
- "from MulticoreTSNE import MulticoreTSNE as TSNE\n",
+ "from sklearn.manifold import TSNE\n",
"\n",
"# Compute t-SNE on the extracted features.\n",
"tsne = TSNE(n_jobs=4)\n",
"transformed = tsne.fit_transform(np.vstack(xs))"
- ]
+ ],
+ "metadata": {
+ "collapsed": false,
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ }
},
{
"cell_type": "code",
@@ -273101,6 +273077,7 @@
"import matplotlib.pyplot as plt\n",
"from matplotlib import animation\n",
"\n",
+ "\n",
"def plot_images(img_list):\n",
" def init():\n",
" img.set_data(img_list[0])\n",
@@ -273110,13 +273087,14 @@
" img.set_data(img_list[i])\n",
" return (img,)\n",
"\n",
- " fig = plt.Figure(figsize=(10,10))\n",
+ " fig = plt.Figure(figsize=(10, 10))\n",
" ax = fig.gca()\n",
" img = ax.imshow(img_list[0])\n",
" anim = animation.FuncAnimation(fig, animate, init_func=init,\n",
- " frames=len(img_list), interval=60, blit=True)\n",
+ " frames=len(img_list), interval=60, blit=True)\n",
" return anim\n",
"\n",
+ "\n",
"HTML(plot_images(frames).to_jshtml())"
]
},
diff --git a/poetry.lock b/poetry.lock
index d6e6b225..84ea91fc 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -281,13 +281,13 @@ xxhash = "*"
apache-beam = ["apache-beam (>=2.26.0)"]
audio = ["librosa"]
benchmarks = ["numpy (==1.18.5)", "tensorflow (==2.3.0)", "torch (==1.6.0)", "transformers (==3.0.2)"]
-dev = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[server,s3] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "black (>=22.0,<23.0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)", "importlib-resources"]
+dev = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[s3,server] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "black (>=22.0,<23.0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)", "importlib-resources"]
docs = ["docutils (==0.16.0)", "recommonmark", "sphinx (==3.1.2)", "sphinx-markdown-tables", "sphinx-rtd-theme (==0.4.3)", "sphinxext-opengraph (==0.4.1)", "sphinx-copybutton", "fsspec (<2021.9.0)", "s3fs", "sphinx-panels", "sphinx-inline-tabs", "myst-parser", "Markdown (!=3.3.5)"]
quality = ["black (>=22.0,<23.0)", "flake8 (>=3.8.3)", "isort (>=5.0.0)", "pyyaml (>=5.3.1)"]
s3 = ["fsspec", "boto3", "botocore", "s3fs"]
tensorflow = ["tensorflow (>=2.2.0,!=2.6.0,!=2.6.1)"]
tensorflow_gpu = ["tensorflow-gpu (>=2.2.0,!=2.6.0,!=2.6.1)"]
-tests = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[server,s3] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "importlib-resources"]
+tests = ["absl-py", "pytest", "pytest-datadir", "pytest-xdist", "apache-beam (>=2.26.0)", "elasticsearch (<8.0.0)", "aiobotocore", "boto3", "botocore", "faiss-cpu (>=1.6.4)", "fsspec", "moto[s3,server] (==2.0.4)", "rarfile (>=4.0)", "s3fs (==2021.08.1)", "tensorflow (>=2.3,!=2.6.0,!=2.6.1)", "torch", "torchaudio", "soundfile", "transformers", "bs4", "conllu", "h5py", "langdetect", "lxml", "mwparserfromhell", "nltk", "openpyxl", "py7zr", "tldextract", "zstandard", "bert-score (>=0.3.6)", "rouge-score", "sacrebleu", "scipy", "seqeval", "scikit-learn", "jiwer", "sentencepiece", "torchmetrics (==0.6.0)", "mauve-text", "toml (>=0.10.1)", "requests-file (>=1.5.1)", "tldextract (>=3.1.0)", "texttable (>=1.6.3)", "Werkzeug (>=1.0.1)", "six (>=1.15.0,<1.16.0)", "Pillow (>=6.2.1)", "librosa", "wget (>=3.2)", "pytorch-nlp (==0.5.0)", "pytorch-lightning", "fastBPE (==0.1.0)", "fairseq", "importlib-resources"]
torch = ["torch"]
vision = ["Pillow (>=6.2.1)"]
@@ -403,7 +403,7 @@ python-versions = ">=3.7"
[[package]]
name = "fsspec"
-version = "2022.5.0"
+version = "2022.7.1"
description = "File-system specification"
category = "main"
optional = true
@@ -414,27 +414,27 @@ aiohttp = {version = "*", optional = true, markers = "extra == \"http\""}
requests = {version = "*", optional = true, markers = "extra == \"http\""}
[package.extras]
-abfs = ["adlfs"]
-adl = ["adlfs"]
-arrow = ["pyarrow (>=1)"]
-dask = ["dask", "distributed"]
-dropbox = ["dropboxdrivefs", "requests", "dropbox"]
-entrypoints = ["importlib-metadata"]
-fuse = ["fusepy"]
-gcs = ["gcsfs"]
-git = ["pygit2"]
-github = ["requests"]
-gs = ["gcsfs"]
-gui = ["panel"]
-hdfs = ["pyarrow (>=1)"]
-http = ["requests", "aiohttp"]
-libarchive = ["libarchive-c"]
-oci = ["ocifs"]
-s3 = ["s3fs"]
-sftp = ["paramiko"]
-smb = ["smbprotocol"]
-ssh = ["paramiko"]
tqdm = ["tqdm"]
+ssh = ["paramiko"]
+smb = ["smbprotocol"]
+sftp = ["paramiko"]
+s3 = ["s3fs"]
+oci = ["ocifs"]
+libarchive = ["libarchive-c"]
+http = ["aiohttp", "requests"]
+hdfs = ["pyarrow (>=1)"]
+gui = ["panel"]
+gs = ["gcsfs"]
+github = ["requests"]
+git = ["pygit2"]
+gcs = ["gcsfs"]
+fuse = ["fusepy"]
+entrypoints = ["importlib-metadata"]
+dropbox = ["dropbox", "requests", "dropboxdrivefs"]
+dask = ["distributed", "dask"]
+arrow = ["pyarrow (>=1)"]
+adl = ["adlfs"]
+abfs = ["adlfs"]
[[package]]
name = "ghp-import"
@@ -667,7 +667,7 @@ python-versions = ">=3.6"
[[package]]
name = "jsonargparse"
-version = "4.11.0"
+version = "4.13.1"
description = "Parsing of command line options, yaml/jsonnet config files and/or environment variables based on argparse."
category = "main"
optional = true
@@ -678,22 +678,22 @@ docstring-parser = {version = ">=0.7.3", optional = true, markers = "extra == \"
PyYAML = ">=3.13"
[package.extras]
-all = ["docstring-parser (>=0.7.3)", "jsonschema (>=3.2.0)", "jsonnet (>=0.13.0)", "validators (>=0.14.2)", "requests (>=2.18.4)", "fsspec (>=0.8.4)", "argcomplete (>=2.0.0)", "ruyaml (>=0.20.0)", "omegaconf (>=2.1.1)", "reconplogger (>=4.4.0)", "typing-extensions (>=3.10.0.0)"]
-argcomplete = ["argcomplete (>=2.0.0)"]
-dev = ["coverage (>=4.5.1)", "responses (>=0.12.0)", "Sphinx (>=1.7.9)", "sphinx-rtd-theme (>=0.4.3)", "autodocsumm (>=0.1.10)", "sphinx-autodoc-typehints (>=1.11.1)", "pre-commit (>=2.19.0)", "pylint (>=1.8.3)", "pycodestyle (>=2.5.0)", "mypy (>=0.701)", "tox (>=3.25.0)"]
-doc = ["Sphinx (>=1.7.9)", "sphinx-rtd-theme (>=0.4.3)", "autodocsumm (>=0.1.10)", "sphinx-autodoc-typehints (>=1.11.1)"]
-fsspec = ["fsspec (>=0.8.4)"]
-jsonnet = ["jsonnet (>=0.13.0)"]
-jsonschema = ["jsonschema (>=3.2.0)"]
-maintainer = ["bump2version (>=0.5.11)", "twine (>=4.0.0)"]
-omegaconf = ["omegaconf (>=2.1.1)"]
-reconplogger = ["reconplogger (>=4.4.0)"]
-ruyaml = ["ruyaml (>=0.20.0)"]
-signatures = ["docstring-parser (>=0.7.3)"]
-test = ["coverage (>=4.5.1)", "responses (>=0.12.0)"]
-test_no_urls = ["coverage (>=4.5.1)"]
+urls = ["requests (>=2.18.4)", "validators (>=0.14.2)"]
typing_extensions = ["typing-extensions (>=3.10.0.0)"]
-urls = ["validators (>=0.14.2)", "requests (>=2.18.4)"]
+test_no_urls = ["coverage (>=4.5.1)"]
+test = ["responses (>=0.12.0)", "coverage (>=4.5.1)"]
+signatures = ["docstring-parser (>=0.7.3)"]
+ruyaml = ["ruyaml (>=0.20.0)"]
+reconplogger = ["reconplogger (>=4.4.0)"]
+omegaconf = ["omegaconf (>=2.1.1)"]
+maintainer = ["twine (>=4.0.0)", "bump2version (>=0.5.11)"]
+jsonschema = ["jsonschema (>=3.2.0)"]
+jsonnet = ["jsonnet (>=0.13.0)", "jsonnet-binary (>=0.17.0)"]
+fsspec = ["fsspec (>=0.8.4)"]
+doc = ["sphinx-autodoc-typehints (>=1.11.1)", "autodocsumm (>=0.1.10)", "sphinx-rtd-theme (>=0.4.3)", "Sphinx (>=1.7.9)"]
+dev = ["tox (>=3.25.0)", "mypy (>=0.701)", "pycodestyle (>=2.5.0)", "pylint (>=1.8.3)", "pre-commit (>=2.19.0)", "sphinx-autodoc-typehints (>=1.11.1)", "autodocsumm (>=0.1.10)", "sphinx-rtd-theme (>=0.4.3)", "Sphinx (>=1.7.9)", "responses (>=0.12.0)", "coverage (>=4.5.1)"]
+argcomplete = ["argcomplete (>=2.0.0)"]
+all = ["typing-extensions (>=3.10.0.0)", "jsonnet (>=0.13.0)", "jsonnet-binary (>=0.17.0)", "reconplogger (>=4.4.0)", "omegaconf (>=2.1.1)", "ruyaml (>=0.20.0)", "argcomplete (>=2.0.0)", "fsspec (>=0.8.4)", "requests (>=2.18.4)", "validators (>=0.14.2)", "jsonschema (>=3.2.0)", "docstring-parser (>=0.7.3)"]
[[package]]
name = "jsonschema"
@@ -973,6 +973,17 @@ python-versions = ">=3.7"
Markdown = ">=3.3"
mkdocs = ">=1.1"
+[[package]]
+name = "mkdocs-exclude-search"
+version = "0.6.4"
+description = "A mkdocs plugin that lets you exclude selected files or sections from the search index."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+
+[package.dependencies]
+mkdocs = ">=1.0.4"
+
[[package]]
name = "mkdocs-jupyter"
version = "0.21.0"
@@ -990,7 +1001,7 @@ Pygments = ">=2.12.0,<3.0.0"
[[package]]
name = "mkdocs-material"
-version = "8.3.9"
+version = "8.4.0"
description = "Documentation that simply works"
category = "dev"
optional = false
@@ -2085,7 +2096,7 @@ vision = ["torchvision", "lightning-flash"]
[metadata]
lock-version = "1.1"
python-versions = ">=3.7.1,<3.11"
-content-hash = "8b24bf08ccb4adbc089aba994c49152a88064877e50237bb5747f814f7dfd15b"
+content-hash = "0f8919f0be4fd420c39103f9cbb3cdec17927afc818eb63227ffeab7b606db51"
[metadata.files]
absl-py = []
@@ -2156,6 +2167,7 @@ mergedeep = []
mistune = []
mkdocs = []
mkdocs-autorefs = []
+mkdocs-exclude-search = []
mkdocs-jupyter = []
mkdocs-material = []
mkdocs-material-extensions = []
diff --git a/pyproject.toml b/pyproject.toml
index b39f4791..2a648105 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -52,6 +52,7 @@ mkdocs-jupyter = "^0.21.0"
mkdocs-material = "^8.3.9"
Pygments = "^2.12.0"
mkdocstrings = {extras = ["python"], version = "^0.18.1"}
+mkdocs-exclude-search = "^0.6.4"
[tool.poetry.extras]
vision = ["torchvision", "lightning-flash"]
From 338702b355696bb29b08bc36c9c29f53b1e21536 Mon Sep 17 00:00:00 2001
From: Dref360
Date: Sun, 9 Oct 2022 13:26:46 -0400
Subject: [PATCH 07/10] Update logo
---
docs/_static/css/default.css | 5116 ----------------------
docs/_static/images/logo-transparent.png | Bin 376940 -> 179037 bytes
docs/_static/images/logo-vertical.png | Bin 0 -> 235441 bytes
docs/_static/images/logo-with-bg.jpg | Bin 0 -> 285712 bytes
docs/index.md | 5 +
docs/stylesheets/extra.css | 3 +
mkdocs.yml | 8 +-
poetry.lock | 2 +-
pyproject.toml | 3 +-
9 files changed, 17 insertions(+), 5120 deletions(-)
delete mode 100644 docs/_static/css/default.css
create mode 100644 docs/_static/images/logo-vertical.png
create mode 100644 docs/_static/images/logo-with-bg.jpg
create mode 100644 docs/stylesheets/extra.css
diff --git a/docs/_static/css/default.css b/docs/_static/css/default.css
deleted file mode 100644
index 18dba0dd..00000000
--- a/docs/_static/css/default.css
+++ /dev/null
@@ -1,5116 +0,0 @@
-/*CSS from divio-docs-theme*/
-@charset "UTF-8";
-@import url("https://fonts.googleapis.com/css?family=Nunito:400:600");
-* {
- -webkit-box-sizing: border-box;
- -moz-box-sizing: border-box;
- box-sizing: border-box; }
-
-article, aside, details, figcaption, figure, footer, header, hgroup, nav, section {
- display: block; }
-
-audio, canvas, video {
- display: inline-block;
- *display: inline;
- *zoom: 1; }
-
-audio:not([controls]) {
- display: none; }
-
-[hidden] {
- display: none; }
-
-* {
- -webkit-box-sizing: border-box;
- -moz-box-sizing: border-box;
- box-sizing: border-box; }
-
-html {
- font-size: 100%;
- -webkit-text-size-adjust: 100%;
- -ms-text-size-adjust: 100%; }
-
-body {
- margin: 0;
- -webkit-font-smoothing: antialiased;
- -moz-osx-font-smoothing: grayscale; }
-
-a:hover, a:active {
- outline: 0; }
-
-abbr[title] {
- border-bottom: 1px dotted; }
-
-b, strong {
- font-weight: 600; }
-
-blockquote {
- margin: 0; }
-
-dfn {
- font-style: italic; }
-
-ins {
- background: #ff9;
- color: #000;
- text-decoration: none; }
-
-mark {
- background: #ff0;
- color: #000;
- font-style: italic;
- font-weight: 600; }
-
-pre, code, .rst-content tt, .rst-content code, kbd, samp {
- font-family: monospace, serif;
- _font-family: "courier new", monospace;
- font-size: 1em; }
-
-pre {
- white-space: pre; }
-
-q {
- quotes: none; }
-
-q:before, q:after {
- content: "";
- content: none; }
-
-small {
- font-size: 85%; }
-
-sub, sup {
- font-size: 75%;
- line-height: 0;
- position: relative;
- vertical-align: baseline; }
-
-sup {
- top: -0.5em; }
-
-sub {
- bottom: -0.25em; }
-
-ul, ol, dl {
- padding-left:20px;
- list-style-type: circle;
-}
-
-li {
- list-style: none; }
-
-dd {
- margin: 0; }
-
-img {
- border: 0;
- -ms-interpolation-mode: bicubic;
- vertical-align: middle;
- max-width: 100%; }
-
-svg:not(:root) {
- overflow: hidden; }
-
-figure {
- margin: 0; }
-
-form {
- margin: 0; }
-
-fieldset {
- border: 0;
- margin: 0;
- padding: 0; }
-
-label {
- cursor: pointer; }
-
-legend {
- border: 0;
- *margin-left: -7px;
- padding: 0;
- white-space: normal; }
-
-button, input, select, textarea {
- font-size: 100%;
- margin: 0;
- vertical-align: baseline;
- *vertical-align: middle; }
-
-button, input {
- line-height: normal; }
-
-button, input[type="button"], input[type="reset"], input[type="submit"] {
- cursor: pointer;
- -webkit-appearance: button;
- *overflow: visible; }
-
-button[disabled], input[disabled] {
- cursor: default; }
-
-input[type="checkbox"], input[type="radio"] {
- box-sizing: border-box;
- padding: 0;
- *width: 13px;
- *height: 13px; }
-
-input[type="search"] {
- -webkit-appearance: textfield;
- -moz-box-sizing: content-box;
- -webkit-box-sizing: content-box;
- box-sizing: content-box; }
-
-input[type="search"]::-webkit-search-decoration, input[type="search"]::-webkit-search-cancel-button {
- -webkit-appearance: none; }
-
-button::-moz-focus-inner, input::-moz-focus-inner {
- border: 0;
- padding: 0; }
-
-textarea {
- overflow: auto;
- vertical-align: top;
- resize: vertical; }
-
-table {
- border-collapse: collapse;
- border-spacing: 0; }
-
-td {
- vertical-align: top; }
-
-.chromeframe {
- margin: 0.2em 0;
- background: #ccc;
- color: black;
- padding: 0.2em 0; }
-
-.ir {
- display: block;
- border: 0;
- text-indent: -999em;
- overflow: hidden;
- background-color: transparent;
- background-repeat: no-repeat;
- text-align: left;
- direction: ltr;
- *line-height: 0; }
-
-.ir br {
- display: none; }
-
-.hidden {
- display: none !important;
- visibility: hidden; }
-
-.visuallyhidden {
- border: 0;
- clip: rect(0 0 0 0);
- height: 1px;
- margin: -1px;
- overflow: hidden;
- padding: 0;
- position: absolute;
- width: 1px; }
-
-.visuallyhidden.focusable:active, .visuallyhidden.focusable:focus {
- clip: auto;
- height: auto;
- margin: 0;
- overflow: visible;
- position: static;
- width: auto; }
-
-.invisible {
- visibility: hidden; }
-
-.relative {
- position: relative; }
-
-big, small {
- font-size: 100%; }
-
-@media print {
- html, body, section {
- background: none !important; }
- * {
- box-shadow: none !important;
- text-shadow: none !important;
- filter: none !important;
- -ms-filter: none !important; }
- a, a:visited {
- text-decoration: underline; }
- .ir a:after, a[href^="javascript:"]:after, a[href^="#"]:after {
- content: ""; }
- pre, blockquote {
- page-break-inside: avoid; }
- thead {
- display: table-header-group; }
- tr, img {
- page-break-inside: avoid; }
- img {
- max-width: 100% !important; }
- @page {
- margin: 0.5cm; }
- p, h2, .rst-content .toctree-wrapper p.caption, h3 {
- orphans: 3;
- widows: 3; }
- h2, .rst-content .toctree-wrapper p.caption, h3 {
- page-break-after: avoid; } }
-
-.fa:before, .wy-menu-vertical li span.toctree-expand:before, .wy-menu-vertical li.on a span.toctree-expand:before, .wy-menu-vertical li.current > a span.toctree-expand:before, .rst-content .admonition-title:before, .rst-content h1 .headerlink:before, .rst-content h2 .headerlink:before, .rst-content h3 .headerlink:before, .rst-content h4 .headerlink:before, .rst-content h5 .headerlink:before, .rst-content h6 .headerlink:before, .rst-content dl dt .headerlink:before, .rst-content p.caption .headerlink:before, .rst-content tt.download span:first-child:before, .rst-content code.download span:first-child:before, .icon:before, .wy-dropdown .caret:before, .wy-inline-validate.wy-inline-validate-success .wy-input-context:before, .wy-inline-validate.wy-inline-validate-danger .wy-input-context:before, .wy-inline-validate.wy-inline-validate-warning .wy-input-context:before, .wy-inline-validate.wy-inline-validate-info .wy-input-context:before, .wy-alert, .rst-content .note, .rst-content .attention, .rst-content .caution, .rst-content .danger, .rst-content .error, .rst-content .hint, .rst-content .important, .rst-content .tip, .rst-content .warning, .rst-content .seealso, .rst-content .admonition-todo, .rst-content .admonition, .btn, input[type="text"], input[type="password"], input[type="email"], input[type="url"], input[type="date"], input[type="month"], input[type="time"], input[type="datetime"], input[type="datetime-local"], input[type="week"], input[type="number"], input[type="search"], input[type="tel"], input[type="color"], select, textarea, .wy-menu-vertical li.on a, .wy-menu-vertical li.current > a, .wy-side-nav-search > a, .wy-side-nav-search .wy-dropdown > a, .wy-nav-top a {
- -webkit-font-smoothing: antialiased; }
-
-.clearfix {
- *zoom: 1; }
-
-.clearfix:before, .clearfix:after {
- display: table;
- content: ""; }
-
-.clearfix:after {
- clear: both; }
-
-/*!
- * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome
- * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License)
- */
-/* FONT PATH
- * -------------------------- */
-@font-face {
- font-family: 'FontAwesome';
- src: url("../fonts/fontawesome-webfont.eot?v=4.7.0");
- src: url("../fonts/fontawesome-webfont.eot?#iefix&v=4.7.0") format("embedded-opentype"), url("../fonts/fontawesome-webfont.woff2?v=4.7.0") format("woff2"), url("../fonts/fontawesome-webfont.woff?v=4.7.0") format("woff"), url("../fonts/fontawesome-webfont.ttf?v=4.7.0") format("truetype"), url("../fonts/fontawesome-webfont.svg?v=4.7.0#fontawesomeregular") format("svg");
- font-weight: normal;
- font-style: normal; }
-
-.fa, .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand, .rst-content .admonition-title, .rst-content h1 .headerlink, .rst-content h2 .headerlink, .rst-content h3 .headerlink, .rst-content h4 .headerlink, .rst-content h5 .headerlink, .rst-content h6 .headerlink, .rst-content dl dt .headerlink, .rst-content p.caption .headerlink, .rst-content tt.download span:first-child, .rst-content code.download span:first-child, .icon {
- display: inline-block;
- font: normal normal normal 14px/1 FontAwesome;
- font-size: inherit;
- text-rendering: auto;
- -webkit-font-smoothing: antialiased;
- -moz-osx-font-smoothing: grayscale; }
-
-/* makes the font 33% larger relative to the icon container */
-.fa-lg {
- font-size: 1.3333333333em;
- line-height: 0.75em;
- vertical-align: -15%; }
-
-.fa-2x {
- font-size: 2em; }
-
-.fa-3x {
- font-size: 3em; }
-
-.fa-4x {
- font-size: 4em; }
-
-.fa-5x {
- font-size: 5em; }
-
-.fa-fw {
- width: 1.2857142857em;
- text-align: center; }
-
-.fa-ul {
- padding-left: 0;
- margin-left: 2.1428571429em;
- list-style-type: circle; }
-
-.fa-ul > li {
- position: relative; }
-
-.fa-li {
- position: absolute;
- left: -2.1428571429em;
- width: 2.1428571429em;
- top: 0.1428571429em;
- text-align: center; }
-
-.fa-li.fa-lg {
- left: -1.8571428571em; }
-
-.fa-border {
- padding: .2em .25em .15em;
- border: solid 0.08em #eee;
- border-radius: .1em; }
-
-.fa-pull-left {
- float: left; }
-
-.fa-pull-right {
- float: right; }
-
-.fa.fa-pull-left, .wy-menu-vertical li span.fa-pull-left.toctree-expand, .wy-menu-vertical li.on a span.fa-pull-left.toctree-expand, .wy-menu-vertical li.current > a span.fa-pull-left.toctree-expand, .rst-content .fa-pull-left.admonition-title, .rst-content h1 .fa-pull-left.headerlink, .rst-content h2 .fa-pull-left.headerlink, .rst-content h3 .fa-pull-left.headerlink, .rst-content h4 .fa-pull-left.headerlink, .rst-content h5 .fa-pull-left.headerlink, .rst-content h6 .fa-pull-left.headerlink, .rst-content dl dt .fa-pull-left.headerlink, .rst-content p.caption .fa-pull-left.headerlink, .rst-content tt.download span.fa-pull-left:first-child, .rst-content code.download span.fa-pull-left:first-child, .fa-pull-left.icon {
- margin-right: .3em; }
-
-.fa.fa-pull-right, .wy-menu-vertical li span.fa-pull-right.toctree-expand, .wy-menu-vertical li.on a span.fa-pull-right.toctree-expand, .wy-menu-vertical li.current > a span.fa-pull-right.toctree-expand, .rst-content .fa-pull-right.admonition-title, .rst-content h1 .fa-pull-right.headerlink, .rst-content h2 .fa-pull-right.headerlink, .rst-content h3 .fa-pull-right.headerlink, .rst-content h4 .fa-pull-right.headerlink, .rst-content h5 .fa-pull-right.headerlink, .rst-content h6 .fa-pull-right.headerlink, .rst-content dl dt .fa-pull-right.headerlink, .rst-content p.caption .fa-pull-right.headerlink, .rst-content tt.download span.fa-pull-right:first-child, .rst-content code.download span.fa-pull-right:first-child, .fa-pull-right.icon {
- margin-left: .3em; }
-
-/* Deprecated as of 4.4.0 */
-.pull-right {
- float: right; }
-
-.pull-left {
- float: left; }
-
-.fa.pull-left, .wy-menu-vertical li span.pull-left.toctree-expand, .wy-menu-vertical li.on a span.pull-left.toctree-expand, .wy-menu-vertical li.current > a span.pull-left.toctree-expand, .rst-content .pull-left.admonition-title, .rst-content h1 .pull-left.headerlink, .rst-content h2 .pull-left.headerlink, .rst-content h3 .pull-left.headerlink, .rst-content h4 .pull-left.headerlink, .rst-content h5 .pull-left.headerlink, .rst-content h6 .pull-left.headerlink, .rst-content dl dt .pull-left.headerlink, .rst-content p.caption .pull-left.headerlink, .rst-content tt.download span.pull-left:first-child, .rst-content code.download span.pull-left:first-child, .pull-left.icon {
- margin-right: .3em; }
-
-.fa.pull-right, .wy-menu-vertical li span.pull-right.toctree-expand, .wy-menu-vertical li.on a span.pull-right.toctree-expand, .wy-menu-vertical li.current > a span.pull-right.toctree-expand, .rst-content .pull-right.admonition-title, .rst-content h1 .pull-right.headerlink, .rst-content h2 .pull-right.headerlink, .rst-content h3 .pull-right.headerlink, .rst-content h4 .pull-right.headerlink, .rst-content h5 .pull-right.headerlink, .rst-content h6 .pull-right.headerlink, .rst-content dl dt .pull-right.headerlink, .rst-content p.caption .pull-right.headerlink, .rst-content tt.download span.pull-right:first-child, .rst-content code.download span.pull-right:first-child, .pull-right.icon {
- margin-left: .3em; }
-
-.fa-spin {
- -webkit-animation: fa-spin 2s infinite linear;
- animation: fa-spin 2s infinite linear; }
-
-.fa-pulse {
- -webkit-animation: fa-spin 1s infinite steps(8);
- animation: fa-spin 1s infinite steps(8); }
-
-@-webkit-keyframes fa-spin {
- 0% {
- -webkit-transform: rotate(0deg);
- transform: rotate(0deg); }
- 100% {
- -webkit-transform: rotate(359deg);
- transform: rotate(359deg); } }
-
-@keyframes fa-spin {
- 0% {
- -webkit-transform: rotate(0deg);
- transform: rotate(0deg); }
- 100% {
- -webkit-transform: rotate(359deg);
- transform: rotate(359deg); } }
-
-.fa-rotate-90 {
- -ms-filter: "progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";
- -webkit-transform: rotate(90deg);
- -ms-transform: rotate(90deg);
- transform: rotate(90deg); }
-
-.fa-rotate-180 {
- -ms-filter: "progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";
- -webkit-transform: rotate(180deg);
- -ms-transform: rotate(180deg);
- transform: rotate(180deg); }
-
-.fa-rotate-270 {
- -ms-filter: "progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";
- -webkit-transform: rotate(270deg);
- -ms-transform: rotate(270deg);
- transform: rotate(270deg); }
-
-.fa-flip-horizontal {
- -ms-filter: "progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";
- -webkit-transform: scale(-1, 1);
- -ms-transform: scale(-1, 1);
- transform: scale(-1, 1); }
-
-.fa-flip-vertical {
- -ms-filter: "progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";
- -webkit-transform: scale(1, -1);
- -ms-transform: scale(1, -1);
- transform: scale(1, -1); }
-
-:root .fa-rotate-90,
-:root .fa-rotate-180,
-:root .fa-rotate-270,
-:root .fa-flip-horizontal,
-:root .fa-flip-vertical {
- filter: none; }
-
-.fa-stack {
- position: relative;
- display: inline-block;
- width: 2em;
- height: 2em;
- line-height: 2em;
- vertical-align: middle; }
-
-.fa-stack-1x, .fa-stack-2x {
- position: absolute;
- left: 0;
- width: 100%;
- text-align: center; }
-
-.fa-stack-1x {
- line-height: inherit; }
-
-.fa-stack-2x {
- font-size: 2em; }
-
-.fa-inverse {
- color: #fff; }
-
-/* Font Awesome uses the Unicode Private Use Area (PUA) to ensure screen
- readers do not read off random characters that represent icons */
-.fa-glass:before {
- content: ""; }
-
-.fa-music:before {
- content: ""; }
-
-.fa-search:before, .icon-search:before {
- content: ""; }
-
-.fa-envelope-o:before {
- content: ""; }
-
-.fa-heart:before {
- content: ""; }
-
-.fa-star:before {
- content: ""; }
-
-.fa-star-o:before {
- content: ""; }
-
-.fa-user:before {
- content: ""; }
-
-.fa-film:before {
- content: ""; }
-
-.fa-th-large:before {
- content: ""; }
-
-.fa-th:before {
- content: ""; }
-
-.fa-th-list:before {
- content: ""; }
-
-.fa-check:before {
- content: ""; }
-
-.fa-remove:before,
-.fa-close:before,
-.fa-times:before {
- content: ""; }
-
-.fa-search-plus:before {
- content: ""; }
-
-.fa-search-minus:before {
- content: ""; }
-
-.fa-power-off:before {
- content: ""; }
-
-.fa-signal:before {
- content: ""; }
-
-.fa-gear:before,
-.fa-cog:before {
- content: ""; }
-
-.fa-trash-o:before {
- content: ""; }
-
-.fa-home:before, .icon-home:before {
- content: ""; }
-
-.fa-file-o:before {
- content: ""; }
-
-.fa-clock-o:before {
- content: ""; }
-
-.fa-road:before {
- content: ""; }
-
-.fa-download:before, .rst-content tt.download span:first-child:before, .rst-content code.download span:first-child:before {
- content: ""; }
-
-.fa-arrow-circle-o-down:before {
- content: ""; }
-
-.fa-arrow-circle-o-up:before {
- content: ""; }
-
-.fa-inbox:before {
- content: ""; }
-
-.fa-play-circle-o:before {
- content: ""; }
-
-.fa-rotate-right:before,
-.fa-repeat:before {
- content: ""; }
-
-.fa-refresh:before {
- content: ""; }
-
-.fa-list-alt:before {
- content: ""; }
-
-.fa-lock:before {
- content: ""; }
-
-.fa-flag:before {
- content: ""; }
-
-.fa-headphones:before {
- content: ""; }
-
-.fa-volume-off:before {
- content: ""; }
-
-.fa-volume-down:before {
- content: ""; }
-
-.fa-volume-up:before {
- content: ""; }
-
-.fa-qrcode:before {
- content: ""; }
-
-.fa-barcode:before {
- content: ""; }
-
-.fa-tag:before {
- content: ""; }
-
-.fa-tags:before {
- content: ""; }
-
-.fa-book:before, .icon-book:before {
- content: ""; }
-
-.fa-bookmark:before {
- content: ""; }
-
-.fa-print:before {
- content: ""; }
-
-.fa-camera:before {
- content: ""; }
-
-.fa-font:before {
- content: ""; }
-
-.fa-bold:before {
- content: ""; }
-
-.fa-italic:before {
- content: ""; }
-
-.fa-text-height:before {
- content: ""; }
-
-.fa-text-width:before {
- content: ""; }
-
-.fa-align-left:before {
- content: ""; }
-
-.fa-align-center:before {
- content: ""; }
-
-.fa-align-right:before {
- content: ""; }
-
-.fa-align-justify:before {
- content: ""; }
-
-.fa-list:before {
- content: ""; }
-
-.fa-dedent:before,
-.fa-outdent:before {
- content: ""; }
-
-.fa-indent:before {
- content: ""; }
-
-.fa-video-camera:before {
- content: ""; }
-
-.fa-photo:before,
-.fa-image:before,
-.fa-picture-o:before {
- content: ""; }
-
-.fa-pencil:before {
- content: ""; }
-
-.fa-map-marker:before {
- content: ""; }
-
-.fa-adjust:before {
- content: ""; }
-
-.fa-tint:before {
- content: ""; }
-
-.fa-edit:before,
-.fa-pencil-square-o:before {
- content: ""; }
-
-.fa-share-square-o:before {
- content: ""; }
-
-.fa-check-square-o:before {
- content: ""; }
-
-.fa-arrows:before {
- content: ""; }
-
-.fa-step-backward:before {
- content: ""; }
-
-.fa-fast-backward:before {
- content: ""; }
-
-.fa-backward:before {
- content: ""; }
-
-.fa-play:before {
- content: ""; }
-
-.fa-pause:before {
- content: ""; }
-
-.fa-stop:before {
- content: ""; }
-
-.fa-forward:before {
- content: ""; }
-
-.fa-fast-forward:before {
- content: ""; }
-
-.fa-step-forward:before {
- content: ""; }
-
-.fa-eject:before {
- content: ""; }
-
-.fa-chevron-left:before {
- content: ""; }
-
-.fa-chevron-right:before,
-.wy-menu-vertical li span.toctree-expand:before {
- content: ""; }
-
-.fa-plus-circle:before {
- content: ""; }
-
-.fa-minus-circle:before {
- content: ""; }
-
-.fa-times-circle:before, .wy-inline-validate.wy-inline-validate-danger .wy-input-context:before {
- content: ""; }
-
-.fa-check-circle:before, .wy-inline-validate.wy-inline-validate-success .wy-input-context:before {
- content: ""; }
-
-.fa-question-circle:before {
- content: ""; }
-
-.fa-info-circle:before {
- content: ""; }
-
-.fa-crosshairs:before {
- content: ""; }
-
-.fa-times-circle-o:before {
- content: ""; }
-
-.fa-check-circle-o:before {
- content: ""; }
-
-.fa-ban:before {
- content: ""; }
-
-.fa-arrow-left:before {
- content: ""; }
-
-.fa-arrow-right:before {
- content: ""; }
-
-.fa-arrow-up:before {
- content: ""; }
-
-.fa-arrow-down:before {
- content: ""; }
-
-.fa-mail-forward:before,
-.fa-share:before {
- content: ""; }
-
-.fa-expand:before {
- content: ""; }
-
-.fa-compress:before {
- content: ""; }
-
-.fa-plus:before {
- content: ""; }
-
-.fa-minus:before {
- content: ""; }
-
-.fa-asterisk:before {
- content: ""; }
-
-.fa-exclamation-circle:before, .wy-inline-validate.wy-inline-validate-warning .wy-input-context:before, .wy-inline-validate.wy-inline-validate-info .wy-input-context:before, .rst-content .admonition-title:before {
- content: ""; }
-
-.fa-gift:before {
- content: ""; }
-
-.fa-leaf:before {
- content: ""; }
-
-.fa-fire:before, .icon-fire:before {
- content: ""; }
-
-.fa-eye:before {
- content: ""; }
-
-.fa-eye-slash:before {
- content: ""; }
-
-.fa-warning:before,
-.fa-exclamation-triangle:before {
- content: ""; }
-
-.fa-plane:before {
- content: ""; }
-
-.fa-calendar:before {
- content: ""; }
-
-.fa-random:before {
- content: ""; }
-
-.fa-comment:before {
- content: ""; }
-
-.fa-magnet:before {
- content: ""; }
-
-.fa-chevron-up:before {
- content: ""; }
-
-.fa-chevron-down:before,
-.wy-menu-vertical li.on a span.toctree-expand:before,
-.wy-menu-vertical li.current > a span.toctree-expand:before {
- content: ""; }
-
-.fa-retweet:before {
- content: ""; }
-
-.fa-shopping-cart:before {
- content: ""; }
-
-.fa-folder:before {
- content: ""; }
-
-.fa-folder-open:before {
- content: ""; }
-
-.fa-arrows-v:before {
- content: ""; }
-
-.fa-arrows-h:before {
- content: ""; }
-
-.fa-bar-chart-o:before,
-.fa-bar-chart:before {
- content: ""; }
-
-.fa-twitter-square:before {
- content: ""; }
-
-.fa-facebook-square:before {
- content: ""; }
-
-.fa-camera-retro:before {
- content: ""; }
-
-.fa-key:before {
- content: ""; }
-
-.fa-gears:before,
-.fa-cogs:before {
- content: ""; }
-
-.fa-comments:before {
- content: ""; }
-
-.fa-thumbs-o-up:before {
- content: ""; }
-
-.fa-thumbs-o-down:before {
- content: ""; }
-
-.fa-star-half:before {
- content: ""; }
-
-.fa-heart-o:before {
- content: ""; }
-
-.fa-sign-out:before {
- content: ""; }
-
-.fa-linkedin-square:before {
- content: ""; }
-
-.fa-thumb-tack:before {
- content: ""; }
-
-.fa-external-link:before {
- content: ""; }
-
-.fa-sign-in:before {
- content: ""; }
-
-.fa-trophy:before {
- content: ""; }
-
-.fa-github-square:before {
- content: ""; }
-
-.fa-upload:before {
- content: ""; }
-
-.fa-lemon-o:before {
- content: ""; }
-
-.fa-phone:before {
- content: ""; }
-
-.fa-square-o:before {
- content: ""; }
-
-.fa-bookmark-o:before {
- content: ""; }
-
-.fa-phone-square:before {
- content: ""; }
-
-.fa-twitter:before {
- content: ""; }
-
-.fa-facebook-f:before,
-.fa-facebook:before {
- content: ""; }
-
-.fa-github:before, .icon-github:before {
- content: ""; }
-
-.fa-unlock:before {
- content: ""; }
-
-.fa-credit-card:before {
- content: ""; }
-
-.fa-feed:before,
-.fa-rss:before {
- content: ""; }
-
-.fa-hdd-o:before {
- content: ""; }
-
-.fa-bullhorn:before {
- content: ""; }
-
-.fa-bell:before {
- content: ""; }
-
-.fa-certificate:before {
- content: ""; }
-
-.fa-hand-o-right:before {
- content: ""; }
-
-.fa-hand-o-left:before {
- content: ""; }
-
-.fa-hand-o-up:before {
- content: ""; }
-
-.fa-hand-o-down:before {
- content: ""; }
-
-.fa-arrow-circle-left:before, .icon-circle-arrow-left:before {
- margin: 2px 5px 1px 1px;
- content: ""; }
-
-.fa-arrow-circle-right:before, .icon-circle-arrow-right:before {
- margin: 2px 1px 1px 5px;
- content: ""; }
-
-.fa-arrow-circle-up:before {
- content: ""; }
-
-.fa-arrow-circle-down:before {
- content: ""; }
-
-.fa-globe:before {
- content: ""; }
-
-.fa-wrench:before {
- content: ""; }
-
-.fa-tasks:before {
- content: ""; }
-
-.fa-filter:before {
- content: ""; }
-
-.fa-briefcase:before {
- content: ""; }
-
-.fa-arrows-alt:before {
- content: ""; }
-
-.fa-group:before,
-.fa-users:before {
- content: ""; }
-
-.fa-chain:before,
-.fa-link:before,
-.icon-link:before {
- content: ""; }
-
-.fa-cloud:before {
- content: ""; }
-
-.fa-flask:before {
- content: ""; }
-
-.fa-cut:before,
-.fa-scissors:before {
- content: ""; }
-
-.fa-copy:before,
-.fa-files-o:before {
- content: ""; }
-
-.fa-paperclip:before {
- content: ""; }
-
-.fa-save:before,
-.fa-floppy-o:before {
- content: ""; }
-
-.fa-square:before {
- content: ""; }
-
-.fa-navicon:before,
-.fa-reorder:before,
-.fa-bars:before {
- content: ""; }
-
-.fa-list-ul:before {
- content: ""; }
-
-.fa-list-ol:before {
- content: ""; }
-
-.fa-strikethrough:before {
- content: ""; }
-
-.fa-underline:before {
- content: ""; }
-
-.fa-table:before {
- content: ""; }
-
-.fa-magic:before {
- content: ""; }
-
-.fa-truck:before {
- content: ""; }
-
-.fa-pinterest:before {
- content: ""; }
-
-.fa-pinterest-square:before {
- content: ""; }
-
-.fa-google-plus-square:before {
- content: ""; }
-
-.fa-google-plus:before {
- content: ""; }
-
-.fa-money:before {
- content: ""; }
-
-.fa-caret-down:before, .wy-dropdown .caret:before, .icon-caret-down:before {
- content: ""; }
-
-.fa-caret-up:before {
- content: ""; }
-
-.fa-caret-left:before {
- content: ""; }
-
-.fa-caret-right:before {
- content: ""; }
-
-.fa-columns:before {
- content: ""; }
-
-.fa-unsorted:before,
-.fa-sort:before {
- content: ""; }
-
-.fa-sort-down:before,
-.fa-sort-desc:before {
- content: ""; }
-
-.fa-sort-up:before,
-.fa-sort-asc:before {
- content: ""; }
-
-.fa-envelope:before {
- content: ""; }
-
-.fa-linkedin:before {
- content: ""; }
-
-.fa-rotate-left:before,
-.fa-undo:before {
- content: ""; }
-
-.fa-legal:before,
-.fa-gavel:before {
- content: ""; }
-
-.fa-dashboard:before,
-.fa-tachometer:before {
- content: ""; }
-
-.fa-comment-o:before {
- content: ""; }
-
-.fa-comments-o:before {
- content: ""; }
-
-.fa-flash:before,
-.fa-bolt:before {
- content: ""; }
-
-.fa-sitemap:before {
- content: ""; }
-
-.fa-umbrella:before {
- content: ""; }
-
-.fa-paste:before,
-.fa-clipboard:before {
- content: ""; }
-
-.fa-lightbulb-o:before {
- content: ""; }
-
-.fa-exchange:before {
- content: ""; }
-
-.fa-cloud-download:before {
- content: ""; }
-
-.fa-cloud-upload:before {
- content: ""; }
-
-.fa-user-md:before {
- content: ""; }
-
-.fa-stethoscope:before {
- content: ""; }
-
-.fa-suitcase:before {
- content: ""; }
-
-.fa-bell-o:before {
- content: ""; }
-
-.fa-coffee:before {
- content: ""; }
-
-.fa-cutlery:before {
- content: ""; }
-
-.fa-file-text-o:before {
- content: ""; }
-
-.fa-building-o:before {
- content: ""; }
-
-.fa-hospital-o:before {
- content: ""; }
-
-.fa-ambulance:before {
- content: ""; }
-
-.fa-medkit:before {
- content: ""; }
-
-.fa-fighter-jet:before {
- content: ""; }
-
-.fa-beer:before {
- content: ""; }
-
-.fa-h-square:before {
- content: ""; }
-
-.fa-plus-square:before {
- content: ""; }
-
-.fa-angle-double-left:before {
- content: ""; }
-
-.fa-angle-double-right:before {
- content: ""; }
-
-.fa-angle-double-up:before {
- content: ""; }
-
-.fa-angle-double-down:before {
- content: ""; }
-
-.fa-angle-left:before {
- content: ""; }
-
-.fa-angle-right:before {
- content: ""; }
-
-.fa-angle-up:before {
- content: ""; }
-
-.fa-angle-down:before {
- content: ""; }
-
-.fa-desktop:before {
- content: ""; }
-
-.fa-laptop:before {
- content: ""; }
-
-.fa-tablet:before {
- content: ""; }
-
-.fa-mobile-phone:before,
-.fa-mobile:before {
- content: ""; }
-
-.fa-circle-o:before {
- content: ""; }
-
-.fa-quote-left:before {
- content: ""; }
-
-.fa-quote-right:before {
- content: ""; }
-
-.fa-spinner:before {
- content: ""; }
-
-.fa-circle:before {
- content: ""; }
-
-.fa-mail-reply:before,
-.fa-reply:before {
- content: ""; }
-
-.fa-github-alt:before {
- content: ""; }
-
-.fa-folder-o:before {
- content: ""; }
-
-.fa-folder-open-o:before {
- content: ""; }
-
-.fa-smile-o:before {
- content: ""; }
-
-.fa-frown-o:before {
- content: ""; }
-
-.fa-meh-o:before {
- content: ""; }
-
-.fa-gamepad:before {
- content: ""; }
-
-.fa-keyboard-o:before {
- content: ""; }
-
-.fa-flag-o:before {
- content: ""; }
-
-.fa-flag-checkered:before {
- content: ""; }
-
-.fa-terminal:before {
- content: ""; }
-
-.fa-code:before {
- content: ""; }
-
-.fa-mail-reply-all:before,
-.fa-reply-all:before {
- content: ""; }
-
-.fa-star-half-empty:before,
-.fa-star-half-full:before,
-.fa-star-half-o:before {
- content: ""; }
-
-.fa-location-arrow:before {
- content: ""; }
-
-.fa-crop:before {
- content: ""; }
-
-.fa-code-fork:before {
- content: ""; }
-
-.fa-unlink:before,
-.fa-chain-broken:before {
- content: ""; }
-
-.fa-question:before {
- content: ""; }
-
-.fa-info:before {
- content: ""; }
-
-.fa-exclamation:before {
- content: ""; }
-
-.fa-superscript:before {
- content: ""; }
-
-.fa-subscript:before {
- content: ""; }
-
-.fa-eraser:before {
- content: ""; }
-
-.fa-puzzle-piece:before {
- content: ""; }
-
-.fa-microphone:before {
- content: ""; }
-
-.fa-microphone-slash:before {
- content: ""; }
-
-.fa-shield:before {
- content: ""; }
-
-.fa-calendar-o:before {
- content: ""; }
-
-.fa-fire-extinguisher:before {
- content: ""; }
-
-.fa-rocket:before {
- content: ""; }
-
-.fa-maxcdn:before {
- content: ""; }
-
-.fa-chevron-circle-left:before {
- content: ""; }
-
-.fa-chevron-circle-right:before {
- content: ""; }
-
-.fa-chevron-circle-up:before {
- content: ""; }
-
-.fa-chevron-circle-down:before {
- content: ""; }
-
-.fa-html5:before {
- content: ""; }
-
-.fa-css3:before {
- content: ""; }
-
-.fa-anchor:before {
- content: ""; }
-
-.fa-unlock-alt:before {
- content: ""; }
-
-.fa-bullseye:before {
- content: ""; }
-
-.fa-ellipsis-h:before {
- content: ""; }
-
-.fa-ellipsis-v:before {
- content: ""; }
-
-.fa-rss-square:before {
- content: ""; }
-
-.fa-play-circle:before {
- content: ""; }
-
-.fa-ticket:before {
- content: ""; }
-
-.fa-minus-square:before {
- content: ""; }
-
-.fa-minus-square-o:before {
- content: ""; }
-
-.fa-level-up:before {
- content: ""; }
-
-.fa-level-down:before {
- content: ""; }
-
-.fa-check-square:before {
- content: ""; }
-
-.fa-pencil-square:before {
- content: ""; }
-
-.fa-external-link-square:before {
- content: ""; }
-
-.fa-share-square:before {
- content: ""; }
-
-.fa-compass:before {
- content: ""; }
-
-.fa-toggle-down:before,
-.fa-caret-square-o-down:before {
- content: ""; }
-
-.fa-toggle-up:before,
-.fa-caret-square-o-up:before {
- content: ""; }
-
-.fa-toggle-right:before,
-.fa-caret-square-o-right:before {
- content: ""; }
-
-.fa-euro:before,
-.fa-eur:before {
- content: ""; }
-
-.fa-gbp:before {
- content: ""; }
-
-.fa-dollar:before,
-.fa-usd:before {
- content: ""; }
-
-.fa-rupee:before,
-.fa-inr:before {
- content: ""; }
-
-.fa-cny:before,
-.fa-rmb:before,
-.fa-yen:before,
-.fa-jpy:before {
- content: ""; }
-
-.fa-ruble:before,
-.fa-rouble:before,
-.fa-rub:before {
- content: ""; }
-
-.fa-won:before,
-.fa-krw:before {
- content: ""; }
-
-.fa-bitcoin:before,
-.fa-btc:before {
- content: ""; }
-
-.fa-file:before {
- content: ""; }
-
-.fa-file-text:before {
- content: ""; }
-
-.fa-sort-alpha-asc:before {
- content: ""; }
-
-.fa-sort-alpha-desc:before {
- content: ""; }
-
-.fa-sort-amount-asc:before {
- content: ""; }
-
-.fa-sort-amount-desc:before {
- content: ""; }
-
-.fa-sort-numeric-asc:before {
- content: ""; }
-
-.fa-sort-numeric-desc:before {
- content: ""; }
-
-.fa-thumbs-up:before {
- content: ""; }
-
-.fa-thumbs-down:before {
- content: ""; }
-
-.fa-youtube-square:before {
- content: ""; }
-
-.fa-youtube:before {
- content: ""; }
-
-.fa-xing:before {
- content: ""; }
-
-.fa-xing-square:before {
- content: ""; }
-
-.fa-youtube-play:before {
- content: ""; }
-
-.fa-dropbox:before {
- content: ""; }
-
-.fa-stack-overflow:before {
- content: ""; }
-
-.fa-instagram:before {
- content: ""; }
-
-.fa-flickr:before {
- content: ""; }
-
-.fa-adn:before {
- content: ""; }
-
-.fa-bitbucket:before, .icon-bitbucket:before {
- content: ""; }
-
-.fa-bitbucket-square:before {
- content: ""; }
-
-.fa-tumblr:before {
- content: ""; }
-
-.fa-tumblr-square:before {
- content: ""; }
-
-.fa-long-arrow-down:before {
- content: ""; }
-
-.fa-long-arrow-up:before {
- content: ""; }
-
-.fa-long-arrow-left:before {
- content: ""; }
-
-.fa-long-arrow-right:before {
- content: ""; }
-
-.fa-apple:before {
- content: ""; }
-
-.fa-windows:before {
- content: ""; }
-
-.fa-android:before {
- content: ""; }
-
-.fa-linux:before {
- content: ""; }
-
-.fa-dribbble:before {
- content: ""; }
-
-.fa-skype:before {
- content: ""; }
-
-.fa-foursquare:before {
- content: ""; }
-
-.fa-trello:before {
- content: ""; }
-
-.fa-female:before {
- content: ""; }
-
-.fa-male:before {
- content: ""; }
-
-.fa-gittip:before,
-.fa-gratipay:before {
- content: ""; }
-
-.fa-sun-o:before {
- content: ""; }
-
-.fa-moon-o:before {
- content: ""; }
-
-.fa-archive:before {
- content: ""; }
-
-.fa-bug:before {
- content: ""; }
-
-.fa-vk:before {
- content: ""; }
-
-.fa-weibo:before {
- content: ""; }
-
-.fa-renren:before {
- content: ""; }
-
-.fa-pagelines:before {
- content: ""; }
-
-.fa-stack-exchange:before {
- content: ""; }
-
-.fa-arrow-circle-o-right:before {
- content: ""; }
-
-.fa-arrow-circle-o-left:before {
- content: ""; }
-
-.fa-toggle-left:before,
-.fa-caret-square-o-left:before {
- content: ""; }
-
-.fa-dot-circle-o:before {
- content: ""; }
-
-.fa-wheelchair:before {
- content: ""; }
-
-.fa-vimeo-square:before {
- content: ""; }
-
-.fa-turkish-lira:before,
-.fa-try:before {
- content: ""; }
-
-.fa-plus-square-o:before {
- content: ""; }
-
-.fa-space-shuttle:before {
- content: ""; }
-
-.fa-slack:before {
- content: ""; }
-
-.fa-envelope-square:before {
- content: ""; }
-
-.fa-wordpress:before {
- content: ""; }
-
-.fa-openid:before {
- content: ""; }
-
-.fa-institution:before,
-.fa-bank:before,
-.fa-university:before {
- content: ""; }
-
-.fa-mortar-board:before,
-.fa-graduation-cap:before {
- content: ""; }
-
-.fa-yahoo:before {
- content: ""; }
-
-.fa-google:before {
- content: ""; }
-
-.fa-reddit:before {
- content: ""; }
-
-.fa-reddit-square:before {
- content: ""; }
-
-.fa-stumbleupon-circle:before {
- content: ""; }
-
-.fa-stumbleupon:before {
- content: ""; }
-
-.fa-delicious:before {
- content: ""; }
-
-.fa-digg:before {
- content: ""; }
-
-.fa-pied-piper-pp:before {
- content: ""; }
-
-.fa-pied-piper-alt:before {
- content: ""; }
-
-.fa-drupal:before {
- content: ""; }
-
-.fa-joomla:before {
- content: ""; }
-
-.fa-language:before {
- content: ""; }
-
-.fa-fax:before {
- content: ""; }
-
-.fa-building:before {
- content: ""; }
-
-.fa-child:before {
- content: ""; }
-
-.fa-paw:before {
- content: ""; }
-
-.fa-spoon:before {
- content: ""; }
-
-.fa-cube:before {
- content: ""; }
-
-.fa-cubes:before {
- content: ""; }
-
-.fa-behance:before {
- content: ""; }
-
-.fa-behance-square:before {
- content: ""; }
-
-.fa-steam:before {
- content: ""; }
-
-.fa-steam-square:before {
- content: ""; }
-
-.fa-recycle:before {
- content: ""; }
-
-.fa-automobile:before,
-.fa-car:before {
- content: ""; }
-
-.fa-cab:before,
-.fa-taxi:before {
- content: ""; }
-
-.fa-tree:before {
- content: ""; }
-
-.fa-spotify:before {
- content: ""; }
-
-.fa-deviantart:before {
- content: ""; }
-
-.fa-soundcloud:before {
- content: ""; }
-
-.fa-database:before {
- content: ""; }
-
-.fa-file-pdf-o:before {
- content: ""; }
-
-.fa-file-word-o:before {
- content: ""; }
-
-.fa-file-excel-o:before {
- content: ""; }
-
-.fa-file-powerpoint-o:before {
- content: ""; }
-
-.fa-file-photo-o:before,
-.fa-file-picture-o:before,
-.fa-file-image-o:before {
- content: ""; }
-
-.fa-file-zip-o:before,
-.fa-file-archive-o:before {
- content: ""; }
-
-.fa-file-sound-o:before,
-.fa-file-audio-o:before {
- content: ""; }
-
-.fa-file-movie-o:before,
-.fa-file-video-o:before {
- content: ""; }
-
-.fa-file-code-o:before {
- content: ""; }
-
-.fa-vine:before {
- content: ""; }
-
-.fa-codepen:before {
- content: ""; }
-
-.fa-jsfiddle:before {
- content: ""; }
-
-.fa-life-bouy:before,
-.fa-life-buoy:before,
-.fa-life-saver:before,
-.fa-support:before,
-.fa-life-ring:before {
- content: ""; }
-
-.fa-circle-o-notch:before {
- content: ""; }
-
-.fa-ra:before,
-.fa-resistance:before,
-.fa-rebel:before {
- content: ""; }
-
-.fa-ge:before,
-.fa-empire:before {
- content: ""; }
-
-.fa-git-square:before {
- content: ""; }
-
-.fa-git:before {
- content: ""; }
-
-.fa-y-combinator-square:before,
-.fa-yc-square:before,
-.fa-hacker-news:before {
- content: ""; }
-
-.fa-tencent-weibo:before {
- content: ""; }
-
-.fa-qq:before {
- content: ""; }
-
-.fa-wechat:before,
-.fa-weixin:before {
- content: ""; }
-
-.fa-send:before,
-.fa-paper-plane:before {
- content: ""; }
-
-.fa-send-o:before,
-.fa-paper-plane-o:before {
- content: ""; }
-
-.fa-history:before {
- content: ""; }
-
-.fa-circle-thin:before {
- content: ""; }
-
-.fa-header:before {
- content: ""; }
-
-.fa-paragraph:before {
- content: ""; }
-
-.fa-sliders:before {
- content: ""; }
-
-.fa-share-alt:before {
- content: ""; }
-
-.fa-share-alt-square:before {
- content: ""; }
-
-.fa-bomb:before {
- content: ""; }
-
-.fa-soccer-ball-o:before,
-.fa-futbol-o:before {
- content: ""; }
-
-.fa-tty:before {
- content: ""; }
-
-.fa-binoculars:before {
- content: ""; }
-
-.fa-plug:before {
- content: ""; }
-
-.fa-slideshare:before {
- content: ""; }
-
-.fa-twitch:before {
- content: ""; }
-
-.fa-yelp:before {
- content: ""; }
-
-.fa-newspaper-o:before {
- content: ""; }
-
-.fa-wifi:before {
- content: ""; }
-
-.fa-calculator:before {
- content: ""; }
-
-.fa-paypal:before {
- content: ""; }
-
-.fa-google-wallet:before {
- content: ""; }
-
-.fa-cc-visa:before {
- content: ""; }
-
-.fa-cc-mastercard:before {
- content: ""; }
-
-.fa-cc-discover:before {
- content: ""; }
-
-.fa-cc-amex:before {
- content: ""; }
-
-.fa-cc-paypal:before {
- content: ""; }
-
-.fa-cc-stripe:before {
- content: ""; }
-
-.fa-bell-slash:before {
- content: ""; }
-
-.fa-bell-slash-o:before {
- content: ""; }
-
-.fa-trash:before {
- content: ""; }
-
-.fa-copyright:before {
- content: ""; }
-
-.fa-at:before {
- content: ""; }
-
-.fa-eyedropper:before {
- content: ""; }
-
-.fa-paint-brush:before {
- content: ""; }
-
-.fa-birthday-cake:before {
- content: ""; }
-
-.fa-area-chart:before {
- content: ""; }
-
-.fa-pie-chart:before {
- content: ""; }
-
-.fa-line-chart:before {
- content: ""; }
-
-.fa-lastfm:before {
- content: ""; }
-
-.fa-lastfm-square:before {
- content: ""; }
-
-.fa-toggle-off:before {
- content: ""; }
-
-.fa-toggle-on:before {
- content: ""; }
-
-.fa-bicycle:before {
- content: ""; }
-
-.fa-bus:before {
- content: ""; }
-
-.fa-ioxhost:before {
- content: ""; }
-
-.fa-angellist:before {
- content: ""; }
-
-.fa-cc:before {
- content: ""; }
-
-.fa-shekel:before,
-.fa-sheqel:before,
-.fa-ils:before {
- content: ""; }
-
-.fa-meanpath:before {
- content: ""; }
-
-.fa-buysellads:before {
- content: ""; }
-
-.fa-connectdevelop:before {
- content: ""; }
-
-.fa-dashcube:before {
- content: ""; }
-
-.fa-forumbee:before {
- content: ""; }
-
-.fa-leanpub:before {
- content: ""; }
-
-.fa-sellsy:before {
- content: ""; }
-
-.fa-shirtsinbulk:before {
- content: ""; }
-
-.fa-simplybuilt:before {
- content: ""; }
-
-.fa-skyatlas:before {
- content: ""; }
-
-.fa-cart-plus:before {
- content: ""; }
-
-.fa-cart-arrow-down:before {
- content: ""; }
-
-.fa-diamond:before {
- content: ""; }
-
-.fa-ship:before {
- content: ""; }
-
-.fa-user-secret:before {
- content: ""; }
-
-.fa-motorcycle:before {
- content: ""; }
-
-.fa-street-view:before {
- content: ""; }
-
-.fa-heartbeat:before {
- content: ""; }
-
-.fa-venus:before {
- content: ""; }
-
-.fa-mars:before {
- content: ""; }
-
-.fa-mercury:before {
- content: ""; }
-
-.fa-intersex:before,
-.fa-transgender:before {
- content: ""; }
-
-.fa-transgender-alt:before {
- content: ""; }
-
-.fa-venus-double:before {
- content: ""; }
-
-.fa-mars-double:before {
- content: ""; }
-
-.fa-venus-mars:before {
- content: ""; }
-
-.fa-mars-stroke:before {
- content: ""; }
-
-.fa-mars-stroke-v:before {
- content: ""; }
-
-.fa-mars-stroke-h:before {
- content: ""; }
-
-.fa-neuter:before {
- content: ""; }
-
-.fa-genderless:before {
- content: ""; }
-
-.fa-facebook-official:before {
- content: ""; }
-
-.fa-pinterest-p:before {
- content: ""; }
-
-.fa-whatsapp:before {
- content: ""; }
-
-.fa-server:before {
- content: ""; }
-
-.fa-user-plus:before {
- content: ""; }
-
-.fa-user-times:before {
- content: ""; }
-
-.fa-hotel:before,
-.fa-bed:before {
- content: ""; }
-
-.fa-viacoin:before {
- content: ""; }
-
-.fa-train:before {
- content: ""; }
-
-.fa-subway:before {
- content: ""; }
-
-.fa-medium:before {
- content: ""; }
-
-.fa-yc:before,
-.fa-y-combinator:before {
- content: ""; }
-
-.fa-optin-monster:before {
- content: ""; }
-
-.fa-opencart:before {
- content: ""; }
-
-.fa-expeditedssl:before {
- content: ""; }
-
-.fa-battery-4:before,
-.fa-battery:before,
-.fa-battery-full:before {
- content: ""; }
-
-.fa-battery-3:before,
-.fa-battery-three-quarters:before {
- content: ""; }
-
-.fa-battery-2:before,
-.fa-battery-half:before {
- content: ""; }
-
-.fa-battery-1:before,
-.fa-battery-quarter:before {
- content: ""; }
-
-.fa-battery-0:before,
-.fa-battery-empty:before {
- content: ""; }
-
-.fa-mouse-pointer:before {
- content: ""; }
-
-.fa-i-cursor:before {
- content: ""; }
-
-.fa-object-group:before {
- content: ""; }
-
-.fa-object-ungroup:before {
- content: ""; }
-
-.fa-sticky-note:before {
- content: ""; }
-
-.fa-sticky-note-o:before {
- content: ""; }
-
-.fa-cc-jcb:before {
- content: ""; }
-
-.fa-cc-diners-club:before {
- content: ""; }
-
-.fa-clone:before {
- content: ""; }
-
-.fa-balance-scale:before {
- content: ""; }
-
-.fa-hourglass-o:before {
- content: ""; }
-
-.fa-hourglass-1:before,
-.fa-hourglass-start:before {
- content: ""; }
-
-.fa-hourglass-2:before,
-.fa-hourglass-half:before {
- content: ""; }
-
-.fa-hourglass-3:before,
-.fa-hourglass-end:before {
- content: ""; }
-
-.fa-hourglass:before {
- content: ""; }
-
-.fa-hand-grab-o:before,
-.fa-hand-rock-o:before {
- content: ""; }
-
-.fa-hand-stop-o:before,
-.fa-hand-paper-o:before {
- content: ""; }
-
-.fa-hand-scissors-o:before {
- content: ""; }
-
-.fa-hand-lizard-o:before {
- content: ""; }
-
-.fa-hand-spock-o:before {
- content: ""; }
-
-.fa-hand-pointer-o:before {
- content: ""; }
-
-.fa-hand-peace-o:before {
- content: ""; }
-
-.fa-trademark:before {
- content: ""; }
-
-.fa-registered:before {
- content: ""; }
-
-.fa-creative-commons:before {
- content: ""; }
-
-.fa-gg:before {
- content: ""; }
-
-.fa-gg-circle:before {
- content: ""; }
-
-.fa-tripadvisor:before {
- content: ""; }
-
-.fa-odnoklassniki:before {
- content: ""; }
-
-.fa-odnoklassniki-square:before {
- content: ""; }
-
-.fa-get-pocket:before {
- content: ""; }
-
-.fa-wikipedia-w:before {
- content: ""; }
-
-.fa-safari:before {
- content: ""; }
-
-.fa-chrome:before {
- content: ""; }
-
-.fa-firefox:before {
- content: ""; }
-
-.fa-opera:before {
- content: ""; }
-
-.fa-internet-explorer:before {
- content: ""; }
-
-.fa-tv:before,
-.fa-television:before {
- content: ""; }
-
-.fa-contao:before {
- content: ""; }
-
-.fa-500px:before {
- content: ""; }
-
-.fa-amazon:before {
- content: ""; }
-
-.fa-calendar-plus-o:before {
- content: ""; }
-
-.fa-calendar-minus-o:before {
- content: ""; }
-
-.fa-calendar-times-o:before {
- content: ""; }
-
-.fa-calendar-check-o:before {
- content: ""; }
-
-.fa-industry:before {
- content: ""; }
-
-.fa-map-pin:before {
- content: ""; }
-
-.fa-map-signs:before {
- content: ""; }
-
-.fa-map-o:before {
- content: ""; }
-
-.fa-map:before {
- content: ""; }
-
-.fa-commenting:before {
- content: ""; }
-
-.fa-commenting-o:before {
- content: ""; }
-
-.fa-houzz:before {
- content: ""; }
-
-.fa-vimeo:before {
- content: ""; }
-
-.fa-black-tie:before {
- content: ""; }
-
-.fa-fonticons:before {
- content: ""; }
-
-.fa-reddit-alien:before {
- content: ""; }
-
-.fa-edge:before {
- content: ""; }
-
-.fa-credit-card-alt:before {
- content: ""; }
-
-.fa-codiepie:before {
- content: ""; }
-
-.fa-modx:before {
- content: ""; }
-
-.fa-fort-awesome:before {
- content: ""; }
-
-.fa-usb:before {
- content: ""; }
-
-.fa-product-hunt:before {
- content: ""; }
-
-.fa-mixcloud:before {
- content: ""; }
-
-.fa-scribd:before {
- content: ""; }
-
-.fa-pause-circle:before {
- content: ""; }
-
-.fa-pause-circle-o:before {
- content: ""; }
-
-.fa-stop-circle:before {
- content: ""; }
-
-.fa-stop-circle-o:before {
- content: ""; }
-
-.fa-shopping-bag:before {
- content: ""; }
-
-.fa-shopping-basket:before {
- content: ""; }
-
-.fa-hashtag:before {
- content: ""; }
-
-.fa-bluetooth:before {
- content: ""; }
-
-.fa-bluetooth-b:before {
- content: ""; }
-
-.fa-percent:before {
- content: ""; }
-
-.fa-gitlab:before, .icon-gitlab:before {
- content: ""; }
-
-.fa-wpbeginner:before {
- content: ""; }
-
-.fa-wpforms:before {
- content: ""; }
-
-.fa-envira:before {
- content: ""; }
-
-.fa-universal-access:before {
- content: ""; }
-
-.fa-wheelchair-alt:before {
- content: ""; }
-
-.fa-question-circle-o:before {
- content: ""; }
-
-.fa-blind:before {
- content: ""; }
-
-.fa-audio-description:before {
- content: ""; }
-
-.fa-volume-control-phone:before {
- content: ""; }
-
-.fa-braille:before {
- content: ""; }
-
-.fa-assistive-listening-systems:before {
- content: ""; }
-
-.fa-asl-interpreting:before,
-.fa-american-sign-language-interpreting:before {
- content: ""; }
-
-.fa-deafness:before,
-.fa-hard-of-hearing:before,
-.fa-deaf:before {
- content: ""; }
-
-.fa-glide:before {
- content: ""; }
-
-.fa-glide-g:before {
- content: ""; }
-
-.fa-signing:before,
-.fa-sign-language:before {
- content: ""; }
-
-.fa-low-vision:before {
- content: ""; }
-
-.fa-viadeo:before {
- content: ""; }
-
-.fa-viadeo-square:before {
- content: ""; }
-
-.fa-snapchat:before {
- content: ""; }
-
-.fa-snapchat-ghost:before {
- content: ""; }
-
-.fa-snapchat-square:before {
- content: ""; }
-
-.fa-pied-piper:before {
- content: ""; }
-
-.fa-first-order:before {
- content: ""; }
-
-.fa-yoast:before {
- content: ""; }
-
-.fa-themeisle:before {
- content: ""; }
-
-.fa-google-plus-circle:before,
-.fa-google-plus-official:before {
- content: ""; }
-
-.fa-fa:before,
-.fa-font-awesome:before {
- content: ""; }
-
-.fa-handshake-o:before {
- content: ""; }
-
-.fa-envelope-open:before {
- content: ""; }
-
-.fa-envelope-open-o:before {
- content: ""; }
-
-.fa-linode:before {
- content: ""; }
-
-.fa-address-book:before {
- content: ""; }
-
-.fa-address-book-o:before {
- content: ""; }
-
-.fa-vcard:before,
-.fa-address-card:before {
- content: ""; }
-
-.fa-vcard-o:before,
-.fa-address-card-o:before {
- content: ""; }
-
-.fa-user-circle:before {
- content: ""; }
-
-.fa-user-circle-o:before {
- content: ""; }
-
-.fa-user-o:before {
- content: ""; }
-
-.fa-id-badge:before {
- content: ""; }
-
-.fa-drivers-license:before,
-.fa-id-card:before {
- content: ""; }
-
-.fa-drivers-license-o:before,
-.fa-id-card-o:before {
- content: ""; }
-
-.fa-quora:before {
- content: ""; }
-
-.fa-free-code-camp:before {
- content: ""; }
-
-.fa-telegram:before {
- content: ""; }
-
-.fa-thermometer-4:before,
-.fa-thermometer:before,
-.fa-thermometer-full:before {
- content: ""; }
-
-.fa-thermometer-3:before,
-.fa-thermometer-three-quarters:before {
- content: ""; }
-
-.fa-thermometer-2:before,
-.fa-thermometer-half:before {
- content: ""; }
-
-.fa-thermometer-1:before,
-.fa-thermometer-quarter:before {
- content: ""; }
-
-.fa-thermometer-0:before,
-.fa-thermometer-empty:before {
- content: ""; }
-
-.fa-shower:before {
- content: ""; }
-
-.fa-bathtub:before,
-.fa-s15:before,
-.fa-bath:before {
- content: ""; }
-
-.fa-podcast:before {
- content: ""; }
-
-.fa-window-maximize:before {
- content: ""; }
-
-.fa-window-minimize:before {
- content: ""; }
-
-.fa-window-restore:before {
- content: ""; }
-
-.fa-times-rectangle:before,
-.fa-window-close:before {
- content: ""; }
-
-.fa-times-rectangle-o:before,
-.fa-window-close-o:before {
- content: ""; }
-
-.fa-bandcamp:before {
- content: ""; }
-
-.fa-grav:before {
- content: ""; }
-
-.fa-etsy:before {
- content: ""; }
-
-.fa-imdb:before {
- content: ""; }
-
-.fa-ravelry:before {
- content: ""; }
-
-.fa-eercast:before {
- content: ""; }
-
-.fa-microchip:before {
- content: ""; }
-
-.fa-snowflake-o:before {
- content: ""; }
-
-.fa-superpowers:before {
- content: ""; }
-
-.fa-wpexplorer:before {
- content: ""; }
-
-.fa-meetup:before {
- content: ""; }
-
-.sr-only {
- position: absolute;
- width: 1px;
- height: 1px;
- padding: 0;
- margin: -1px;
- overflow: hidden;
- clip: rect(0, 0, 0, 0);
- border: 0; }
-
-.sr-only-focusable:active, .sr-only-focusable:focus {
- position: static;
- width: auto;
- height: auto;
- margin: 0;
- overflow: visible;
- clip: auto; }
-
-.fa, .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand, .rst-content .admonition-title, .rst-content h1 .headerlink, .rst-content h2 .headerlink, .rst-content h3 .headerlink, .rst-content h4 .headerlink, .rst-content h5 .headerlink, .rst-content h6 .headerlink, .rst-content dl dt .headerlink, .rst-content p.caption .headerlink, .rst-content tt.download span:first-child, .rst-content code.download span:first-child, .icon, .wy-dropdown .caret, .wy-inline-validate.wy-inline-validate-success .wy-input-context, .wy-inline-validate.wy-inline-validate-danger .wy-input-context, .wy-inline-validate.wy-inline-validate-warning .wy-input-context, .wy-inline-validate.wy-inline-validate-info .wy-input-context {
- font-family: inherit; }
-
-.fa:before, .wy-menu-vertical li span.toctree-expand:before, .wy-menu-vertical li.on a span.toctree-expand:before, .wy-menu-vertical li.current > a span.toctree-expand:before, .rst-content .admonition-title:before, .rst-content h1 .headerlink:before, .rst-content h2 .headerlink:before, .rst-content h3 .headerlink:before, .rst-content h4 .headerlink:before, .rst-content h5 .headerlink:before, .rst-content h6 .headerlink:before, .rst-content dl dt .headerlink:before, .rst-content p.caption .headerlink:before, .rst-content tt.download span:first-child:before, .rst-content code.download span:first-child:before, .icon:before, .wy-dropdown .caret:before, .wy-inline-validate.wy-inline-validate-success .wy-input-context:before, .wy-inline-validate.wy-inline-validate-danger .wy-input-context:before, .wy-inline-validate.wy-inline-validate-warning .wy-input-context:before, .wy-inline-validate.wy-inline-validate-info .wy-input-context:before {
- font-family: "FontAwesome";
- display: inline-block;
- font-style: normal;
- font-weight: normal;
- line-height: 1;
- text-decoration: inherit; }
-
-a .fa, a .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li a span.toctree-expand, .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand, a .rst-content .admonition-title, .rst-content a .admonition-title, a .rst-content h1 .headerlink, .rst-content h1 a .headerlink, a .rst-content h2 .headerlink, .rst-content h2 a .headerlink, a .rst-content h3 .headerlink, .rst-content h3 a .headerlink, a .rst-content h4 .headerlink, .rst-content h4 a .headerlink, a .rst-content h5 .headerlink, .rst-content h5 a .headerlink, a .rst-content h6 .headerlink, .rst-content h6 a .headerlink, a .rst-content dl dt .headerlink, .rst-content dl dt a .headerlink, a .rst-content p.caption .headerlink, .rst-content p.caption a .headerlink, a .rst-content tt.download span:first-child, .rst-content tt.download a span:first-child, a .rst-content code.download span:first-child, .rst-content code.download a span:first-child, a .icon {
- display: inline-block;
- text-decoration: inherit; }
-
-.btn .fa, .btn .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li .btn span.toctree-expand, .btn .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.on a .btn span.toctree-expand, .btn .wy-menu-vertical li.current > a span.toctree-expand, .wy-menu-vertical li.current > a .btn span.toctree-expand, .btn .rst-content .admonition-title, .rst-content .btn .admonition-title, .btn .rst-content h1 .headerlink, .rst-content h1 .btn .headerlink, .btn .rst-content h2 .headerlink, .rst-content h2 .btn .headerlink, .btn .rst-content h3 .headerlink, .rst-content h3 .btn .headerlink, .btn .rst-content h4 .headerlink, .rst-content h4 .btn .headerlink, .btn .rst-content h5 .headerlink, .rst-content h5 .btn .headerlink, .btn .rst-content h6 .headerlink, .rst-content h6 .btn .headerlink, .btn .rst-content dl dt .headerlink, .rst-content dl dt .btn .headerlink, .btn .rst-content p.caption .headerlink, .rst-content p.caption .btn .headerlink, .btn .rst-content tt.download span:first-child, .rst-content tt.download .btn span:first-child, .btn .rst-content code.download span:first-child, .rst-content code.download .btn span:first-child, .btn .icon, .nav .fa, .nav .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li .nav span.toctree-expand, .nav .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.on a .nav span.toctree-expand, .nav .wy-menu-vertical li.current > a span.toctree-expand, .wy-menu-vertical li.current > a .nav span.toctree-expand, .nav .rst-content .admonition-title, .rst-content .nav .admonition-title, .nav .rst-content h1 .headerlink, .rst-content h1 .nav .headerlink, .nav .rst-content h2 .headerlink, .rst-content h2 .nav .headerlink, .nav .rst-content h3 .headerlink, .rst-content h3 .nav .headerlink, .nav .rst-content h4 .headerlink, .rst-content h4 .nav .headerlink, .nav .rst-content h5 .headerlink, .rst-content h5 .nav .headerlink, .nav .rst-content h6 .headerlink, .rst-content h6 .nav .headerlink, .nav .rst-content dl dt .headerlink, .rst-content dl dt .nav .headerlink, .nav .rst-content p.caption .headerlink, .rst-content p.caption .nav .headerlink, .nav .rst-content tt.download span:first-child, .rst-content tt.download .nav span:first-child, .nav .rst-content code.download span:first-child, .rst-content code.download .nav span:first-child, .nav .icon {
- display: inline; }
-
-.btn .fa.fa-large, .btn .wy-menu-vertical li span.fa-large.toctree-expand, .wy-menu-vertical li .btn span.fa-large.toctree-expand, .btn .rst-content .fa-large.admonition-title, .rst-content .btn .fa-large.admonition-title, .btn .rst-content h1 .fa-large.headerlink, .rst-content h1 .btn .fa-large.headerlink, .btn .rst-content h2 .fa-large.headerlink, .rst-content h2 .btn .fa-large.headerlink, .btn .rst-content h3 .fa-large.headerlink, .rst-content h3 .btn .fa-large.headerlink, .btn .rst-content h4 .fa-large.headerlink, .rst-content h4 .btn .fa-large.headerlink, .btn .rst-content h5 .fa-large.headerlink, .rst-content h5 .btn .fa-large.headerlink, .btn .rst-content h6 .fa-large.headerlink, .rst-content h6 .btn .fa-large.headerlink, .btn .rst-content dl dt .fa-large.headerlink, .rst-content dl dt .btn .fa-large.headerlink, .btn .rst-content p.caption .fa-large.headerlink, .rst-content p.caption .btn .fa-large.headerlink, .btn .rst-content tt.download span.fa-large:first-child, .rst-content tt.download .btn span.fa-large:first-child, .btn .rst-content code.download span.fa-large:first-child, .rst-content code.download .btn span.fa-large:first-child, .btn .fa-large.icon, .nav .fa.fa-large, .nav .wy-menu-vertical li span.fa-large.toctree-expand, .wy-menu-vertical li .nav span.fa-large.toctree-expand, .nav .rst-content .fa-large.admonition-title, .rst-content .nav .fa-large.admonition-title, .nav .rst-content h1 .fa-large.headerlink, .rst-content h1 .nav .fa-large.headerlink, .nav .rst-content h2 .fa-large.headerlink, .rst-content h2 .nav .fa-large.headerlink, .nav .rst-content h3 .fa-large.headerlink, .rst-content h3 .nav .fa-large.headerlink, .nav .rst-content h4 .fa-large.headerlink, .rst-content h4 .nav .fa-large.headerlink, .nav .rst-content h5 .fa-large.headerlink, .rst-content h5 .nav .fa-large.headerlink, .nav .rst-content h6 .fa-large.headerlink, .rst-content h6 .nav .fa-large.headerlink, .nav .rst-content dl dt .fa-large.headerlink, .rst-content dl dt .nav .fa-large.headerlink, .nav .rst-content p.caption .fa-large.headerlink, .rst-content p.caption .nav .fa-large.headerlink, .nav .rst-content tt.download span.fa-large:first-child, .rst-content tt.download .nav span.fa-large:first-child, .nav .rst-content code.download span.fa-large:first-child, .rst-content code.download .nav span.fa-large:first-child, .nav .fa-large.icon {
- line-height: 0.9em; }
-
-.btn .fa.fa-spin, .btn .wy-menu-vertical li span.fa-spin.toctree-expand, .wy-menu-vertical li .btn span.fa-spin.toctree-expand, .btn .rst-content .fa-spin.admonition-title, .rst-content .btn .fa-spin.admonition-title, .btn .rst-content h1 .fa-spin.headerlink, .rst-content h1 .btn .fa-spin.headerlink, .btn .rst-content h2 .fa-spin.headerlink, .rst-content h2 .btn .fa-spin.headerlink, .btn .rst-content h3 .fa-spin.headerlink, .rst-content h3 .btn .fa-spin.headerlink, .btn .rst-content h4 .fa-spin.headerlink, .rst-content h4 .btn .fa-spin.headerlink, .btn .rst-content h5 .fa-spin.headerlink, .rst-content h5 .btn .fa-spin.headerlink, .btn .rst-content h6 .fa-spin.headerlink, .rst-content h6 .btn .fa-spin.headerlink, .btn .rst-content dl dt .fa-spin.headerlink, .rst-content dl dt .btn .fa-spin.headerlink, .btn .rst-content p.caption .fa-spin.headerlink, .rst-content p.caption .btn .fa-spin.headerlink, .btn .rst-content tt.download span.fa-spin:first-child, .rst-content tt.download .btn span.fa-spin:first-child, .btn .rst-content code.download span.fa-spin:first-child, .rst-content code.download .btn span.fa-spin:first-child, .btn .fa-spin.icon, .nav .fa.fa-spin, .nav .wy-menu-vertical li span.fa-spin.toctree-expand, .wy-menu-vertical li .nav span.fa-spin.toctree-expand, .nav .rst-content .fa-spin.admonition-title, .rst-content .nav .fa-spin.admonition-title, .nav .rst-content h1 .fa-spin.headerlink, .rst-content h1 .nav .fa-spin.headerlink, .nav .rst-content h2 .fa-spin.headerlink, .rst-content h2 .nav .fa-spin.headerlink, .nav .rst-content h3 .fa-spin.headerlink, .rst-content h3 .nav .fa-spin.headerlink, .nav .rst-content h4 .fa-spin.headerlink, .rst-content h4 .nav .fa-spin.headerlink, .nav .rst-content h5 .fa-spin.headerlink, .rst-content h5 .nav .fa-spin.headerlink, .nav .rst-content h6 .fa-spin.headerlink, .rst-content h6 .nav .fa-spin.headerlink, .nav .rst-content dl dt .fa-spin.headerlink, .rst-content dl dt .nav .fa-spin.headerlink, .nav .rst-content p.caption .fa-spin.headerlink, .rst-content p.caption .nav .fa-spin.headerlink, .nav .rst-content tt.download span.fa-spin:first-child, .rst-content tt.download .nav span.fa-spin:first-child, .nav .rst-content code.download span.fa-spin:first-child, .rst-content code.download .nav span.fa-spin:first-child, .nav .fa-spin.icon {
- display: inline-block; }
-
-.btn.fa:before, .wy-menu-vertical li span.btn.toctree-expand:before, .rst-content .btn.admonition-title:before, .rst-content h1 .btn.headerlink:before, .rst-content h2 .btn.headerlink:before, .rst-content h3 .btn.headerlink:before, .rst-content h4 .btn.headerlink:before, .rst-content h5 .btn.headerlink:before, .rst-content h6 .btn.headerlink:before, .rst-content dl dt .btn.headerlink:before, .rst-content p.caption .btn.headerlink:before, .rst-content tt.download span.btn:first-child:before, .rst-content code.download span.btn:first-child:before, .btn.icon:before {
- opacity: 0.5;
- -webkit-transition: opacity 0.05s ease-in;
- -moz-transition: opacity 0.05s ease-in;
- transition: opacity 0.05s ease-in; }
-
-.btn.fa:hover:before, .wy-menu-vertical li span.btn.toctree-expand:hover:before, .rst-content .btn.admonition-title:hover:before, .rst-content h1 .btn.headerlink:hover:before, .rst-content h2 .btn.headerlink:hover:before, .rst-content h3 .btn.headerlink:hover:before, .rst-content h4 .btn.headerlink:hover:before, .rst-content h5 .btn.headerlink:hover:before, .rst-content h6 .btn.headerlink:hover:before, .rst-content dl dt .btn.headerlink:hover:before, .rst-content p.caption .btn.headerlink:hover:before, .rst-content tt.download span.btn:first-child:hover:before, .rst-content code.download span.btn:first-child:hover:before, .btn.icon:hover:before {
- opacity: 1; }
-
-.btn-mini .fa:before, .btn-mini .wy-menu-vertical li span.toctree-expand:before, .wy-menu-vertical li .btn-mini span.toctree-expand:before, .btn-mini .rst-content .admonition-title:before, .rst-content .btn-mini .admonition-title:before, .btn-mini .rst-content h1 .headerlink:before, .rst-content h1 .btn-mini .headerlink:before, .btn-mini .rst-content h2 .headerlink:before, .rst-content h2 .btn-mini .headerlink:before, .btn-mini .rst-content h3 .headerlink:before, .rst-content h3 .btn-mini .headerlink:before, .btn-mini .rst-content h4 .headerlink:before, .rst-content h4 .btn-mini .headerlink:before, .btn-mini .rst-content h5 .headerlink:before, .rst-content h5 .btn-mini .headerlink:before, .btn-mini .rst-content h6 .headerlink:before, .rst-content h6 .btn-mini .headerlink:before, .btn-mini .rst-content dl dt .headerlink:before, .rst-content dl dt .btn-mini .headerlink:before, .btn-mini .rst-content p.caption .headerlink:before, .rst-content p.caption .btn-mini .headerlink:before, .btn-mini .rst-content tt.download span:first-child:before, .rst-content tt.download .btn-mini span:first-child:before, .btn-mini .rst-content code.download span:first-child:before, .rst-content code.download .btn-mini span:first-child:before, .btn-mini .icon:before {
- font-size: 14px;
- vertical-align: -15%; }
-
-.wy-alert, .rst-content .note, .rst-content .attention, .rst-content .caution, .rst-content .danger, .rst-content .error, .rst-content .hint, .rst-content .important, .rst-content .tip, .rst-content .warning, .rst-content .seealso, .rst-content .admonition-todo, .rst-content .admonition {
- padding: 12px;
- line-height: 24px;
- margin-bottom: 24px;
- background: #e7f2fa;
- border-radius: 5px;
- overflow: hidden; }
-
-.wy-alert-title, .rst-content .admonition-title {
- color: #fff;
- font-weight: 600;
- display: block;
- color: #fff;
- background: #6ab0de;
- margin: -12px;
- padding: 6px 12px;
- margin-bottom: 12px; }
-
-.wy-alert.wy-alert-danger, .rst-content .wy-alert-danger.note, .rst-content .wy-alert-danger.attention, .rst-content .wy-alert-danger.caution, .rst-content .danger, .rst-content .error, .rst-content .wy-alert-danger.hint, .rst-content .wy-alert-danger.important, .rst-content .wy-alert-danger.tip, .rst-content .wy-alert-danger.warning, .rst-content .wy-alert-danger.seealso, .rst-content .wy-alert-danger.admonition-todo, .rst-content .wy-alert-danger.admonition {
- background: #fdf3f2; }
-
-.wy-alert.wy-alert-danger .wy-alert-title, .rst-content .wy-alert-danger.note .wy-alert-title, .rst-content .wy-alert-danger.attention .wy-alert-title, .rst-content .wy-alert-danger.caution .wy-alert-title, .rst-content .danger .wy-alert-title, .rst-content .error .wy-alert-title, .rst-content .wy-alert-danger.hint .wy-alert-title, .rst-content .wy-alert-danger.important .wy-alert-title, .rst-content .wy-alert-danger.tip .wy-alert-title, .rst-content .wy-alert-danger.warning .wy-alert-title, .rst-content .wy-alert-danger.seealso .wy-alert-title, .rst-content .wy-alert-danger.admonition-todo .wy-alert-title, .rst-content .wy-alert-danger.admonition .wy-alert-title, .wy-alert.wy-alert-danger .rst-content .admonition-title, .rst-content .wy-alert.wy-alert-danger .admonition-title, .rst-content .wy-alert-danger.note .admonition-title, .rst-content .wy-alert-danger.attention .admonition-title, .rst-content .wy-alert-danger.caution .admonition-title, .rst-content .danger .admonition-title, .rst-content .error .admonition-title, .rst-content .wy-alert-danger.hint .admonition-title, .rst-content .wy-alert-danger.important .admonition-title, .rst-content .wy-alert-danger.tip .admonition-title, .rst-content .wy-alert-danger.warning .admonition-title, .rst-content .wy-alert-danger.seealso .admonition-title, .rst-content .wy-alert-danger.admonition-todo .admonition-title, .rst-content .wy-alert-danger.admonition .admonition-title {
- background: #f29f97; }
-
-.wy-alert.wy-alert-warning, .rst-content .wy-alert-warning.note, .rst-content .attention, .rst-content .caution, .rst-content .wy-alert-warning.danger, .rst-content .wy-alert-warning.error, .rst-content .wy-alert-warning.hint, .rst-content .wy-alert-warning.important, .rst-content .wy-alert-warning.tip, .rst-content .warning, .rst-content .wy-alert-warning.seealso, .rst-content .admonition-todo, .rst-content .wy-alert-warning.admonition {
- background: #ffedcc; }
-
-.wy-alert.wy-alert-warning .wy-alert-title, .rst-content .wy-alert-warning.note .wy-alert-title, .rst-content .attention .wy-alert-title, .rst-content .caution .wy-alert-title, .rst-content .wy-alert-warning.danger .wy-alert-title, .rst-content .wy-alert-warning.error .wy-alert-title, .rst-content .wy-alert-warning.hint .wy-alert-title, .rst-content .wy-alert-warning.important .wy-alert-title, .rst-content .wy-alert-warning.tip .wy-alert-title, .rst-content .warning .wy-alert-title, .rst-content .wy-alert-warning.seealso .wy-alert-title, .rst-content .admonition-todo .wy-alert-title, .rst-content .wy-alert-warning.admonition .wy-alert-title, .wy-alert.wy-alert-warning .rst-content .admonition-title, .rst-content .wy-alert.wy-alert-warning .admonition-title, .rst-content .wy-alert-warning.note .admonition-title, .rst-content .attention .admonition-title, .rst-content .caution .admonition-title, .rst-content .wy-alert-warning.danger .admonition-title, .rst-content .wy-alert-warning.error .admonition-title, .rst-content .wy-alert-warning.hint .admonition-title, .rst-content .wy-alert-warning.important .admonition-title, .rst-content .wy-alert-warning.tip .admonition-title, .rst-content .warning .admonition-title, .rst-content .wy-alert-warning.seealso .admonition-title, .rst-content .admonition-todo .admonition-title, .rst-content .wy-alert-warning.admonition .admonition-title {
- background: #f0b37e; }
-
-.wy-alert.wy-alert-info, .rst-content .note, .rst-content .wy-alert-info.attention, .rst-content .wy-alert-info.caution, .rst-content .wy-alert-info.danger, .rst-content .wy-alert-info.error, .rst-content .wy-alert-info.hint, .rst-content .wy-alert-info.important, .rst-content .wy-alert-info.tip, .rst-content .wy-alert-info.warning, .rst-content .seealso, .rst-content .wy-alert-info.admonition-todo, .rst-content .wy-alert-info.admonition {
- background: #e7f2fa; }
-
-.wy-alert.wy-alert-info .wy-alert-title, .rst-content .note .wy-alert-title, .rst-content .wy-alert-info.attention .wy-alert-title, .rst-content .wy-alert-info.caution .wy-alert-title, .rst-content .wy-alert-info.danger .wy-alert-title, .rst-content .wy-alert-info.error .wy-alert-title, .rst-content .wy-alert-info.hint .wy-alert-title, .rst-content .wy-alert-info.important .wy-alert-title, .rst-content .wy-alert-info.tip .wy-alert-title, .rst-content .wy-alert-info.warning .wy-alert-title, .rst-content .seealso .wy-alert-title, .rst-content .wy-alert-info.admonition-todo .wy-alert-title, .rst-content .wy-alert-info.admonition .wy-alert-title, .wy-alert.wy-alert-info .rst-content .admonition-title, .rst-content .wy-alert.wy-alert-info .admonition-title, .rst-content .note .admonition-title, .rst-content .wy-alert-info.attention .admonition-title, .rst-content .wy-alert-info.caution .admonition-title, .rst-content .wy-alert-info.danger .admonition-title, .rst-content .wy-alert-info.error .admonition-title, .rst-content .wy-alert-info.hint .admonition-title, .rst-content .wy-alert-info.important .admonition-title, .rst-content .wy-alert-info.tip .admonition-title, .rst-content .wy-alert-info.warning .admonition-title, .rst-content .seealso .admonition-title, .rst-content .wy-alert-info.admonition-todo .admonition-title, .rst-content .wy-alert-info.admonition .admonition-title {
- background: #6ab0de; }
-
-.wy-alert.wy-alert-success, .rst-content .wy-alert-success.note, .rst-content .wy-alert-success.attention, .rst-content .wy-alert-success.caution, .rst-content .wy-alert-success.danger, .rst-content .wy-alert-success.error, .rst-content .hint, .rst-content .important, .rst-content .tip, .rst-content .wy-alert-success.warning, .rst-content .wy-alert-success.seealso, .rst-content .wy-alert-success.admonition-todo, .rst-content .wy-alert-success.admonition {
- background: #dbfaf4; }
-
-.wy-alert.wy-alert-success .wy-alert-title, .rst-content .wy-alert-success.note .wy-alert-title, .rst-content .wy-alert-success.attention .wy-alert-title, .rst-content .wy-alert-success.caution .wy-alert-title, .rst-content .wy-alert-success.danger .wy-alert-title, .rst-content .wy-alert-success.error .wy-alert-title, .rst-content .hint .wy-alert-title, .rst-content .important .wy-alert-title, .rst-content .tip .wy-alert-title, .rst-content .wy-alert-success.warning .wy-alert-title, .rst-content .wy-alert-success.seealso .wy-alert-title, .rst-content .wy-alert-success.admonition-todo .wy-alert-title, .rst-content .wy-alert-success.admonition .wy-alert-title, .wy-alert.wy-alert-success .rst-content .admonition-title, .rst-content .wy-alert.wy-alert-success .admonition-title, .rst-content .wy-alert-success.note .admonition-title, .rst-content .wy-alert-success.attention .admonition-title, .rst-content .wy-alert-success.caution .admonition-title, .rst-content .wy-alert-success.danger .admonition-title, .rst-content .wy-alert-success.error .admonition-title, .rst-content .hint .admonition-title, .rst-content .important .admonition-title, .rst-content .tip .admonition-title, .rst-content .wy-alert-success.warning .admonition-title, .rst-content .wy-alert-success.seealso .admonition-title, .rst-content .wy-alert-success.admonition-todo .admonition-title, .rst-content .wy-alert-success.admonition .admonition-title {
- background: #1abc9c; }
-
-.wy-alert.wy-alert-neutral, .rst-content .wy-alert-neutral.note, .rst-content .wy-alert-neutral.attention, .rst-content .wy-alert-neutral.caution, .rst-content .wy-alert-neutral.danger, .rst-content .wy-alert-neutral.error, .rst-content .wy-alert-neutral.hint, .rst-content .wy-alert-neutral.important, .rst-content .wy-alert-neutral.tip, .rst-content .wy-alert-neutral.warning, .rst-content .wy-alert-neutral.seealso, .rst-content .wy-alert-neutral.admonition-todo, .rst-content .wy-alert-neutral.admonition {
- background: #f3f6f6; }
-
-.wy-alert.wy-alert-neutral .wy-alert-title, .rst-content .wy-alert-neutral.note .wy-alert-title, .rst-content .wy-alert-neutral.attention .wy-alert-title, .rst-content .wy-alert-neutral.caution .wy-alert-title, .rst-content .wy-alert-neutral.danger .wy-alert-title, .rst-content .wy-alert-neutral.error .wy-alert-title, .rst-content .wy-alert-neutral.hint .wy-alert-title, .rst-content .wy-alert-neutral.important .wy-alert-title, .rst-content .wy-alert-neutral.tip .wy-alert-title, .rst-content .wy-alert-neutral.warning .wy-alert-title, .rst-content .wy-alert-neutral.seealso .wy-alert-title, .rst-content .wy-alert-neutral.admonition-todo .wy-alert-title, .rst-content .wy-alert-neutral.admonition .wy-alert-title, .wy-alert.wy-alert-neutral .rst-content .admonition-title, .rst-content .wy-alert.wy-alert-neutral .admonition-title, .rst-content .wy-alert-neutral.note .admonition-title, .rst-content .wy-alert-neutral.attention .admonition-title, .rst-content .wy-alert-neutral.caution .admonition-title, .rst-content .wy-alert-neutral.danger .admonition-title, .rst-content .wy-alert-neutral.error .admonition-title, .rst-content .wy-alert-neutral.hint .admonition-title, .rst-content .wy-alert-neutral.important .admonition-title, .rst-content .wy-alert-neutral.tip .admonition-title, .rst-content .wy-alert-neutral.warning .admonition-title, .rst-content .wy-alert-neutral.seealso .admonition-title, .rst-content .wy-alert-neutral.admonition-todo .admonition-title, .rst-content .wy-alert-neutral.admonition .admonition-title {
- color: #404040;
- background: #e1e4e5; }
-
-.wy-alert.wy-alert-neutral a, .rst-content .wy-alert-neutral.note a, .rst-content .wy-alert-neutral.attention a, .rst-content .wy-alert-neutral.caution a, .rst-content .wy-alert-neutral.danger a, .rst-content .wy-alert-neutral.error a, .rst-content .wy-alert-neutral.hint a, .rst-content .wy-alert-neutral.important a, .rst-content .wy-alert-neutral.tip a, .rst-content .wy-alert-neutral.warning a, .rst-content .wy-alert-neutral.seealso a, .rst-content .wy-alert-neutral.admonition-todo a, .rst-content .wy-alert-neutral.admonition a {
- color: #2980B9; }
-
-.wy-alert p:last-child, .rst-content .note p:last-child, .rst-content .attention p:last-child, .rst-content .caution p:last-child, .rst-content .danger p:last-child, .rst-content .error p:last-child, .rst-content .hint p:last-child, .rst-content .important p:last-child, .rst-content .tip p:last-child, .rst-content .warning p:last-child, .rst-content .seealso p:last-child, .rst-content .admonition-todo p:last-child, .rst-content .admonition p:last-child {
- margin-bottom: 0; }
-
-.wy-tray-container {
- position: fixed;
- bottom: 0px;
- left: 0;
- z-index: 600; }
-
-.wy-tray-container li {
- display: block;
- width: 300px;
- background: transparent;
- color: #fff;
- text-align: center;
- box-shadow: 0 5px 5px 0 rgba(0, 0, 0, 0.1);
- padding: 0 24px;
- min-width: 20%;
- opacity: 0;
- height: 0;
- line-height: 56px;
- overflow: hidden;
- -webkit-transition: all 0.3s ease-in;
- -moz-transition: all 0.3s ease-in;
- transition: all 0.3s ease-in; }
-
-.wy-tray-container li.wy-tray-item-success {
- background: #27AE60; }
-
-.wy-tray-container li.wy-tray-item-info {
- background: #2980B9; }
-
-.wy-tray-container li.wy-tray-item-warning {
- background: #E67E22; }
-
-.wy-tray-container li.wy-tray-item-danger {
- background: #E74C3C; }
-
-.wy-tray-container li.on {
- opacity: 1;
- height: 56px; }
-
-@media screen and (max-width: 768px) {
- .wy-tray-container {
- bottom: auto;
- top: 0;
- width: 100%; }
- .wy-tray-container li {
- width: 100%; } }
-
-button {
- font-size: 100%;
- margin: 0;
- vertical-align: baseline;
- *vertical-align: middle;
- cursor: pointer;
- line-height: normal;
- -webkit-appearance: button;
- *overflow: visible; }
-
-button::-moz-focus-inner, input::-moz-focus-inner {
- border: 0;
- padding: 0; }
-
-button[disabled] {
- cursor: default; }
-
-.btn {
- /* Structure */
- display: inline-flex;
- align-items: center;
- border-radius: 2px;
- line-height: normal;
- white-space: nowrap;
- text-align: center;
- cursor: pointer;
- font-size: 100%;
- padding: 6px 12px 6px 12px;
- color: #fff;
- background-color: #27AE60;
- text-decoration: none;
- border-radius: 5px;
- font-weight: normal;
- outline-none: false;
- vertical-align: middle;
- *display: inline;
- zoom: 1;
- -webkit-user-drag: none;
- -webkit-user-select: none;
- -moz-user-select: none;
- -ms-user-select: none;
- user-select: none;
- -webkit-transition: all 0.1s linear;
- -moz-transition: all 0.1s linear;
- transition: all 0.1s linear; }
-
-.btn-hover {
- background: #2e8ece;
- color: #fff; }
-
-.btn:hover {
- background: #2cc36b;
- color: #fff; }
-
-.btn:focus {
- background: #2cc36b;
- outline: 0; }
-
-.btn:active {
- box-shadow: 0px -1px 0px 0px rgba(0, 0, 0, 0.05) inset, 0px 2px 0px 0px rgba(0, 0, 0, 0.1) inset; }
-
-.btn:visited {
- color: #fff; }
-
-.btn:disabled {
- background-image: none;
- filter: progid:DXImageTransform.Microsoft.gradient(enabled = false);
- filter: alpha(opacity=40);
- opacity: 0.4;
- cursor: not-allowed;
- box-shadow: none; }
-
-.btn-disabled {
- background-image: none;
- filter: progid:DXImageTransform.Microsoft.gradient(enabled = false);
- filter: alpha(opacity=40);
- opacity: 0.4;
- cursor: not-allowed;
- box-shadow: none; }
-
-.btn-disabled:hover, .btn-disabled:focus, .btn-disabled:active {
- background-image: none;
- filter: progid:DXImageTransform.Microsoft.gradient(enabled = false);
- filter: alpha(opacity=40);
- opacity: 0.4;
- cursor: not-allowed;
- box-shadow: none; }
-
-.btn::-moz-focus-inner {
- padding: 0;
- border: 0; }
-
-.btn-small {
- font-size: 80%; }
-
-.btn-info {
- background-color: #2980B9 !important; }
-
-.btn-info:hover {
- background-color: #2e8ece !important; }
-
-.btn-neutral {
- background-color: #f5f7f9 !important;
- color: #039bee !important; }
-
-.btn-neutral:hover {
- background-color: #057eb6 !important;
- color: #fff !important; }
-
-.btn-success {
- background-color: #27AE60 !important; }
-
-.btn-success:hover {
- background-color: #229955 !important; }
-
-.btn-danger {
- background-color: #E74C3C !important; }
-
-.btn-danger:hover {
- background-color: #ea6153 !important; }
-
-.btn-warning {
- background-color: #E67E22 !important; }
-
-.btn-warning:hover {
- background-color: #e98b39 !important; }
-
-.btn-invert {
- background-color: #222; }
-
-.btn-invert:hover {
- background-color: #2f2f2f !important; }
-
-.btn-link {
- background-color: transparent !important;
- color: #2980B9;
- box-shadow: none;
- border-color: transparent !important; }
-
-.btn-link:hover {
- background-color: transparent !important;
- color: #409ad5 !important;
- box-shadow: none; }
-
-.btn-link:active {
- background-color: transparent !important;
- color: #409ad5 !important;
- box-shadow: none; }
-
-.btn-link:visited {
- color: #9B59B6; }
-
-.wy-btn-group .btn, .wy-control .btn {
- vertical-align: middle; }
-
-.wy-btn-group {
- margin-bottom: 24px;
- *zoom: 1; }
-
-.wy-btn-group:before, .wy-btn-group:after {
- display: table;
- content: ""; }
-
-.wy-btn-group:after {
- clear: both; }
-
-.wy-dropdown {
- position: relative;
- display: inline-block; }
-
-.wy-dropdown-active .wy-dropdown-menu {
- display: block; }
-
-.wy-dropdown-menu {
- position: absolute;
- left: 0;
- display: none;
- float: left;
- top: 100%;
- min-width: 100%;
- background: #fcfcfc;
- z-index: 100;
- border: solid 1px #cfd7dd;
- box-shadow: 0 2px 2px 0 rgba(0, 0, 0, 0.1);
- padding: 12px; }
-
-.wy-dropdown-menu > dd > a {
- display: block;
- clear: both;
- color: #404040;
- white-space: nowrap;
- font-size: 90%;
- padding: 0 12px;
- cursor: pointer; }
-
-.wy-dropdown-menu > dd > a:hover {
- background: #2980B9;
- color: #fff; }
-
-.wy-dropdown-menu > dd.divider {
- border-top: solid 1px #cfd7dd;
- margin: 6px 0; }
-
-.wy-dropdown-menu > dd.search {
- padding-bottom: 12px; }
-
-.wy-dropdown-menu > dd.search input[type="search"] {
- width: 100%; }
-
-.wy-dropdown-menu > dd.call-to-action {
- background: #e3e3e3;
- text-transform: uppercase;
- font-weight: 500;
- font-size: 80%; }
-
-.wy-dropdown-menu > dd.call-to-action:hover {
- background: #e3e3e3; }
-
-.wy-dropdown-menu > dd.call-to-action .btn {
- color: #fff; }
-
-.wy-dropdown.wy-dropdown-up .wy-dropdown-menu {
- bottom: 100%;
- top: auto;
- left: auto;
- right: 0; }
-
-.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu {
- background: #fcfcfc;
- margin-top: 2px; }
-
-.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a {
- padding: 6px 12px; }
-
-.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a:hover {
- background: #2980B9;
- color: #fff; }
-
-.wy-dropdown.wy-dropdown-left .wy-dropdown-menu {
- right: 0;
- left: auto;
- text-align: right; }
-
-.wy-dropdown-arrow:before {
- content: " ";
- border-bottom: 5px solid whitesmoke;
- border-left: 5px solid transparent;
- border-right: 5px solid transparent;
- position: absolute;
- display: block;
- top: -4px;
- left: 50%;
- margin-left: -3px; }
-
-.wy-dropdown-arrow.wy-dropdown-arrow-left:before {
- left: 11px; }
-
-.wy-form-stacked select {
- display: block;
- padding-top: 30px;}
-
-.wy-form-aligned input, .wy-form-aligned textarea, .wy-form-aligned select, .wy-form-aligned .wy-help-inline, .wy-form-aligned label {
- display: inline-block;
- *display: inline;
- *zoom: 1;
- vertical-align: middle; }
-
-.wy-form-aligned .wy-control-group > label {
- display: inline-block;
- vertical-align: middle;
- width: 10em;
- margin: 6px 12px 0 0;
- float: left; }
-
-.wy-form-aligned .wy-control {
- float: left; }
-
-.wy-form-aligned .wy-control label {
- display: block; }
-
-.wy-form-aligned .wy-control select {
- margin-top: 6px; }
-
-fieldset {
- border: 0;
- margin: 0;
- padding: 0; }
-
-legend {
- display: block;
- width: 100%;
- border: 0;
- padding: 0;
- white-space: normal;
- margin-bottom: 24px;
- font-size: 150%;
- *margin-left: -7px; }
-
-label {
- display: block;
- margin: 0 0 0.3125em 0;
- color: #333;
- font-size: 90%; }
-
-input, select, textarea {
- font-size: 100%;
- margin: 0;
- vertical-align: baseline;
- *vertical-align: middle; }
-
-.wy-control-group {
- margin-bottom: 24px;
- *zoom: 1;
- max-width: 68em;
- margin-left: auto;
- margin-right: auto;
- *zoom: 1; }
-
-.wy-control-group:before, .wy-control-group:after {
- display: table;
- content: ""; }
-
-.wy-control-group:after {
- clear: both; }
-
-.wy-control-group:before, .wy-control-group:after {
- display: table;
- content: ""; }
-
-.wy-control-group:after {
- clear: both; }
-
-.wy-control-group.wy-control-group-required > label:after {
- content: " *";
- color: #E74C3C; }
-
-.wy-control-group .wy-form-full, .wy-control-group .wy-form-halves, .wy-control-group .wy-form-thirds {
- padding-bottom: 12px; }
-
-.wy-control-group .wy-form-full select, .wy-control-group .wy-form-halves select, .wy-control-group .wy-form-thirds select {
- width: 100%; }
-
-.wy-control-group .wy-form-full input[type="text"], .wy-control-group .wy-form-full input[type="password"], .wy-control-group .wy-form-full input[type="email"], .wy-control-group .wy-form-full input[type="url"], .wy-control-group .wy-form-full input[type="date"], .wy-control-group .wy-form-full input[type="month"], .wy-control-group .wy-form-full input[type="time"], .wy-control-group .wy-form-full input[type="datetime"], .wy-control-group .wy-form-full input[type="datetime-local"], .wy-control-group .wy-form-full input[type="week"], .wy-control-group .wy-form-full input[type="number"], .wy-control-group .wy-form-full input[type="search"], .wy-control-group .wy-form-full input[type="tel"], .wy-control-group .wy-form-full input[type="color"], .wy-control-group .wy-form-halves input[type="text"], .wy-control-group .wy-form-halves input[type="password"], .wy-control-group .wy-form-halves input[type="email"], .wy-control-group .wy-form-halves input[type="url"], .wy-control-group .wy-form-halves input[type="date"], .wy-control-group .wy-form-halves input[type="month"], .wy-control-group .wy-form-halves input[type="time"], .wy-control-group .wy-form-halves input[type="datetime"], .wy-control-group .wy-form-halves input[type="datetime-local"], .wy-control-group .wy-form-halves input[type="week"], .wy-control-group .wy-form-halves input[type="number"], .wy-control-group .wy-form-halves input[type="search"], .wy-control-group .wy-form-halves input[type="tel"], .wy-control-group .wy-form-halves input[type="color"], .wy-control-group .wy-form-thirds input[type="text"], .wy-control-group .wy-form-thirds input[type="password"], .wy-control-group .wy-form-thirds input[type="email"], .wy-control-group .wy-form-thirds input[type="url"], .wy-control-group .wy-form-thirds input[type="date"], .wy-control-group .wy-form-thirds input[type="month"], .wy-control-group .wy-form-thirds input[type="time"], .wy-control-group .wy-form-thirds input[type="datetime"], .wy-control-group .wy-form-thirds input[type="datetime-local"], .wy-control-group .wy-form-thirds input[type="week"], .wy-control-group .wy-form-thirds input[type="number"], .wy-control-group .wy-form-thirds input[type="search"], .wy-control-group .wy-form-thirds input[type="tel"], .wy-control-group .wy-form-thirds input[type="color"] {
- width: 100%; }
-
-.wy-control-group .wy-form-full {
- float: left;
- display: block;
- margin-right: 2.3576515979%;
- width: 100%;
- margin-right: 0; }
-
-.wy-control-group .wy-form-full:last-child {
- margin-right: 0; }
-
-.wy-control-group .wy-form-halves {
- float: left;
- display: block;
- margin-right: 2.3576515979%;
- width: 48.821174201%; }
-
-.wy-control-group .wy-form-halves:last-child {
- margin-right: 0; }
-
-.wy-control-group .wy-form-halves:nth-of-type(2n) {
- margin-right: 0; }
-
-.wy-control-group .wy-form-halves:nth-of-type(2n+1) {
- clear: left; }
-
-.wy-control-group .wy-form-thirds {
- float: left;
- display: block;
- margin-right: 2.3576515979%;
- width: 31.7615656014%; }
-
-.wy-control-group .wy-form-thirds:last-child {
- margin-right: 0; }
-
-.wy-control-group .wy-form-thirds:nth-of-type(3n) {
- margin-right: 0; }
-
-.wy-control-group .wy-form-thirds:nth-of-type(3n+1) {
- clear: left; }
-
-.wy-control-group.wy-control-group-no-input .wy-control {
- margin: 6px 0 0 0;
- font-size: 90%; }
-
-.wy-control-no-input {
- display: inline-block;
- margin: 6px 0 0 0;
- font-size: 90%; }
-
-.wy-control-group.fluid-input input[type="text"], .wy-control-group.fluid-input input[type="password"], .wy-control-group.fluid-input input[type="email"], .wy-control-group.fluid-input input[type="url"], .wy-control-group.fluid-input input[type="date"], .wy-control-group.fluid-input input[type="month"], .wy-control-group.fluid-input input[type="time"], .wy-control-group.fluid-input input[type="datetime"], .wy-control-group.fluid-input input[type="datetime-local"], .wy-control-group.fluid-input input[type="week"], .wy-control-group.fluid-input input[type="number"], .wy-control-group.fluid-input input[type="search"], .wy-control-group.fluid-input input[type="tel"], .wy-control-group.fluid-input input[type="color"] {
- width: 100%; }
-
-.wy-form-message-inline {
- display: inline-block;
- padding-left: 0.3em;
- color: #666;
- vertical-align: middle;
- font-size: 90%; }
-
-.wy-form-message {
- display: block;
- color: #999;
- font-size: 70%;
- margin-top: 0.3125em;
- font-style: italic; }
-
-.wy-form-message p {
- font-size: inherit;
- font-style: italic;
- margin-bottom: 6px; }
-
-.wy-form-message p:last-child {
- margin-bottom: 0; }
-
-input {
- line-height: normal; }
-
-input[type="button"], input[type="reset"], input[type="submit"] {
- -webkit-appearance: button;
- cursor: pointer;
- font-family: "Nunito", Arial, sans-serif;
- *overflow: visible; }
-
-input[type="text"], input[type="password"], input[type="email"], input[type="url"], input[type="date"], input[type="month"], input[type="time"], input[type="datetime"], input[type="datetime-local"], input[type="week"], input[type="number"], input[type="search"], input[type="tel"], input[type="color"] {
- -webkit-appearance: none;
- padding: 6px;
- display: inline-block;
- border: 1px solid #ccc;
- font-size: 80%;
- font-family: "Nunito", Arial, sans-serif;
- box-shadow: inset 0 1px 3px #ddd;
- border-radius: 0;
- -webkit-transition: border 0.3s linear;
- -moz-transition: border 0.3s linear;
- transition: border 0.3s linear; }
-
-input[type="datetime-local"] {
- padding: 0.34375em 0.625em; }
-
-input[disabled] {
- cursor: default; }
-
-input[type="checkbox"], input[type="radio"] {
- -webkit-box-sizing: border-box;
- -moz-box-sizing: border-box;
- box-sizing: border-box;
- padding: 0;
- margin-right: 0.3125em;
- *height: 13px;
- *width: 13px; }
-
-input[type="search"] {
- -webkit-box-sizing: border-box;
- -moz-box-sizing: border-box;
- box-sizing: border-box; }
-
-input[type="search"]::-webkit-search-cancel-button, input[type="search"]::-webkit-search-decoration {
- -webkit-appearance: none; }
-
-input[type="text"]:focus, input[type="password"]:focus, input[type="email"]:focus, input[type="url"]:focus, input[type="date"]:focus, input[type="month"]:focus, input[type="time"]:focus, input[type="datetime"]:focus, input[type="datetime-local"]:focus, input[type="week"]:focus, input[type="number"]:focus, input[type="search"]:focus, input[type="tel"]:focus, input[type="color"]:focus {
- outline: 0;
- outline: thin dotted \9;
- border-color: #333; }
-
-input.no-focus:focus {
- border-color: #ccc !important; }
-
-input[type="file"]:focus, input[type="radio"]:focus, input[type="checkbox"]:focus {
- outline: thin dotted #333;
- outline: 1px auto #129FEA; }
-
-input[type="text"][disabled], input[type="password"][disabled], input[type="email"][disabled], input[type="url"][disabled], input[type="date"][disabled], input[type="month"][disabled], input[type="time"][disabled], input[type="datetime"][disabled], input[type="datetime-local"][disabled], input[type="week"][disabled], input[type="number"][disabled], input[type="search"][disabled], input[type="tel"][disabled], input[type="color"][disabled] {
- cursor: not-allowed;
- background-color: #fafafa; }
-
-input:focus:invalid, textarea:focus:invalid, select:focus:invalid {
- color: #E74C3C;
- border: 1px solid #E74C3C; }
-
-input:focus:invalid:focus, textarea:focus:invalid:focus, select:focus:invalid:focus {
- border-color: #E74C3C; }
-
-input[type="file"]:focus:invalid:focus, input[type="radio"]:focus:invalid:focus, input[type="checkbox"]:focus:invalid:focus {
- outline-color: #E74C3C; }
-
-input.wy-input-large {
- padding: 12px;
- font-size: 100%; }
-
-textarea {
- overflow: auto;
- vertical-align: top;
- width: 100%;
- font-family: "Nunito", Arial, sans-serif; }
-
-select, textarea {
- padding: 0.5em 0.625em;
- display: inline-block;
- border: 1px solid #ccc;
- font-size: 80%;
- box-shadow: inset 0 1px 3px #ddd;
- -webkit-transition: border 0.3s linear;
- -moz-transition: border 0.3s linear;
- transition: border 0.3s linear; }
-
-select {
- border: 1px solid #ccc;
- background-color: #fff; }
-
-select[multiple] {
- height: auto; }
-
-select:focus, textarea:focus {
- outline: 0; }
-
-select[disabled], textarea[disabled], input[readonly], select[readonly], textarea[readonly] {
- cursor: not-allowed;
- background-color: #fafafa; }
-
-input[type="radio"][disabled], input[type="checkbox"][disabled] {
- cursor: not-allowed; }
-
-.wy-checkbox, .wy-radio {
- margin: 6px 0;
- color: #404040;
- display: block; }
-
-.wy-checkbox input, .wy-radio input {
- vertical-align: baseline; }
-
-.wy-form-message-inline {
- display: inline-block;
- *display: inline;
- *zoom: 1;
- vertical-align: middle; }
-
-.wy-input-prefix, .wy-input-suffix {
- white-space: nowrap;
- padding: 6px; }
-
-.wy-input-prefix .wy-input-context, .wy-input-suffix .wy-input-context {
- line-height: 27px;
- padding: 0 8px;
- display: inline-block;
- font-size: 80%;
- background-color: #f3f6f6;
- border: solid 1px #ccc;
- color: #999; }
-
-.wy-input-suffix .wy-input-context {
- border-left: 0; }
-
-.wy-input-prefix .wy-input-context {
- border-right: 0; }
-
-.wy-switch {
- position: relative;
- display: block;
- height: 24px;
- margin-top: 12px;
- cursor: pointer; }
-
-.wy-switch:before {
- position: absolute;
- content: "";
- display: block;
- left: 0;
- top: 0;
- width: 36px;
- height: 12px;
- border-radius: 4px;
- background: #ccc;
- -webkit-transition: all 0.2s ease-in-out;
- -moz-transition: all 0.2s ease-in-out;
- transition: all 0.2s ease-in-out; }
-
-.wy-switch:after {
- position: absolute;
- content: "";
- display: block;
- width: 18px;
- height: 18px;
- border-radius: 4px;
- background: #999;
- left: -3px;
- top: -3px;
- -webkit-transition: all 0.2s ease-in-out;
- -moz-transition: all 0.2s ease-in-out;
- transition: all 0.2s ease-in-out; }
-
-.wy-switch span {
- position: absolute;
- left: 48px;
- display: block;
- font-size: 12px;
- color: #ccc;
- line-height: 1; }
-
-.wy-switch.active:before {
- background: #1e8449; }
-
-.wy-switch.active:after {
- left: 24px;
- background: #27AE60; }
-
-.wy-switch.disabled {
- cursor: not-allowed;
- opacity: 0.8; }
-
-.wy-control-group.wy-control-group-error .wy-form-message, .wy-control-group.wy-control-group-error > label {
- color: #E74C3C; }
-
-.wy-control-group.wy-control-group-error input[type="text"], .wy-control-group.wy-control-group-error input[type="password"], .wy-control-group.wy-control-group-error input[type="email"], .wy-control-group.wy-control-group-error input[type="url"], .wy-control-group.wy-control-group-error input[type="date"], .wy-control-group.wy-control-group-error input[type="month"], .wy-control-group.wy-control-group-error input[type="time"], .wy-control-group.wy-control-group-error input[type="datetime"], .wy-control-group.wy-control-group-error input[type="datetime-local"], .wy-control-group.wy-control-group-error input[type="week"], .wy-control-group.wy-control-group-error input[type="number"], .wy-control-group.wy-control-group-error input[type="search"], .wy-control-group.wy-control-group-error input[type="tel"], .wy-control-group.wy-control-group-error input[type="color"] {
- border: solid 1px #E74C3C; }
-
-.wy-control-group.wy-control-group-error textarea {
- border: solid 1px #E74C3C; }
-
-.wy-inline-validate {
- white-space: nowrap; }
-
-.wy-inline-validate .wy-input-context {
- padding: 0.5em 0.625em;
- display: inline-block;
- font-size: 80%; }
-
-.wy-inline-validate.wy-inline-validate-success .wy-input-context {
- color: #27AE60; }
-
-.wy-inline-validate.wy-inline-validate-danger .wy-input-context {
- color: #E74C3C; }
-
-.wy-inline-validate.wy-inline-validate-warning .wy-input-context {
- color: #E67E22; }
-
-.wy-inline-validate.wy-inline-validate-info .wy-input-context {
- color: #2980B9; }
-
-.rotate-90 {
- -webkit-transform: rotate(90deg);
- -moz-transform: rotate(90deg);
- -ms-transform: rotate(90deg);
- -o-transform: rotate(90deg);
- transform: rotate(90deg); }
-
-.rotate-180 {
- -webkit-transform: rotate(180deg);
- -moz-transform: rotate(180deg);
- -ms-transform: rotate(180deg);
- -o-transform: rotate(180deg);
- transform: rotate(180deg); }
-
-.rotate-270 {
- -webkit-transform: rotate(270deg);
- -moz-transform: rotate(270deg);
- -ms-transform: rotate(270deg);
- -o-transform: rotate(270deg);
- transform: rotate(270deg); }
-
-.mirror {
- -webkit-transform: scaleX(-1);
- -moz-transform: scaleX(-1);
- -ms-transform: scaleX(-1);
- -o-transform: scaleX(-1);
- transform: scaleX(-1); }
-
-.mirror.rotate-90 {
- -webkit-transform: scaleX(-1) rotate(90deg);
- -moz-transform: scaleX(-1) rotate(90deg);
- -ms-transform: scaleX(-1) rotate(90deg);
- -o-transform: scaleX(-1) rotate(90deg);
- transform: scaleX(-1) rotate(90deg); }
-
-.mirror.rotate-180 {
- -webkit-transform: scaleX(-1) rotate(180deg);
- -moz-transform: scaleX(-1) rotate(180deg);
- -ms-transform: scaleX(-1) rotate(180deg);
- -o-transform: scaleX(-1) rotate(180deg);
- transform: scaleX(-1) rotate(180deg); }
-
-.mirror.rotate-270 {
- -webkit-transform: scaleX(-1) rotate(270deg);
- -moz-transform: scaleX(-1) rotate(270deg);
- -ms-transform: scaleX(-1) rotate(270deg);
- -o-transform: scaleX(-1) rotate(270deg);
- transform: scaleX(-1) rotate(270deg); }
-
-@media only screen and (max-width: 480px) {
- .wy-form button[type="submit"] {
- margin: 0.7em 0 0; }
- .wy-form input[type="text"], .wy-form input[type="password"], .wy-form input[type="email"], .wy-form input[type="url"], .wy-form input[type="date"], .wy-form input[type="month"], .wy-form input[type="time"], .wy-form input[type="datetime"], .wy-form input[type="datetime-local"], .wy-form input[type="week"], .wy-form input[type="number"], .wy-form input[type="search"], .wy-form input[type="tel"], .wy-form input[type="color"] {
- margin-bottom: 0.3em;
- display: block; }
- .wy-form label {
- margin-bottom: 0.3em;
- display: block; }
- .wy-form input[type="password"], .wy-form input[type="email"], .wy-form input[type="url"], .wy-form input[type="date"], .wy-form input[type="month"], .wy-form input[type="time"], .wy-form input[type="datetime"], .wy-form input[type="datetime-local"], .wy-form input[type="week"], .wy-form input[type="number"], .wy-form input[type="search"], .wy-form input[type="tel"], .wy-form input[type="color"] {
- margin-bottom: 0; }
- .wy-form-aligned .wy-control-group label {
- margin-bottom: 0.3em;
- text-align: left;
- display: block;
- width: 100%; }
- .wy-form-aligned .wy-control {
- margin: 1.5em 0 0 0; }
- .wy-form .wy-help-inline, .wy-form-message-inline, .wy-form-message {
- display: block;
- font-size: 80%;
- padding: 6px 0; } }
-
-@media screen and (max-width: 768px) {
- .tablet-hide {
- display: none; } }
-
-@media screen and (max-width: 480px) {
- .mobile-hide {
- display: none; } }
-
-.float-left {
- float: left; }
-
-.float-right {
- float: right; }
-
-.full-width {
- width: 100%; }
-
-.wy-table, .rst-content table.docutils, .rst-content table.field-list {
- border-collapse: collapse;
- border-spacing: 0;
- empty-cells: show;
- margin-bottom: 24px; }
-
-.wy-table caption, .rst-content table.docutils caption, .rst-content table.field-list caption {
- color: #000;
- font: italic 85%/1 arial, sans-serif;
- padding: 1em 0;
- text-align: center; }
-
-.wy-table td, .rst-content table.docutils td, .rst-content table.field-list td, .wy-table th, .rst-content table.docutils th, .rst-content table.field-list th {
- font-size: 90%;
- margin: 0;
- overflow: visible;
- padding: 8px 16px; }
-
-.wy-table td:first-child, .rst-content table.docutils td:first-child, .rst-content table.field-list td:first-child, .wy-table th:first-child, .rst-content table.docutils th:first-child, .rst-content table.field-list th:first-child {
- border-left-width: 0; }
-
-.wy-table thead, .rst-content table.docutils thead, .rst-content table.field-list thead {
- color: #000;
- text-align: left;
- vertical-align: bottom;
- white-space: nowrap; }
-
-.wy-table thead th, .rst-content table.docutils thead th, .rst-content table.field-list thead th {
- font-weight: 600;
- border-bottom: solid 2px #e1e4e5; }
-
-.wy-table td, .rst-content table.docutils td, .rst-content table.field-list td {
- background-color: transparent;
- vertical-align: middle; }
-
-.wy-table td p, .rst-content table.docutils td p, .rst-content table.field-list td p {
- line-height: 18px; }
-
-.wy-table td p:last-child, .rst-content table.docutils td p:last-child, .rst-content table.field-list td p:last-child {
- margin-bottom: 0; }
-
-.wy-table .wy-table-cell-min, .rst-content table.docutils .wy-table-cell-min, .rst-content table.field-list .wy-table-cell-min {
- width: 1%;
- padding-right: 0; }
-
-.wy-table .wy-table-cell-min input[type=checkbox], .rst-content table.docutils .wy-table-cell-min input[type=checkbox], .rst-content table.field-list .wy-table-cell-min input[type=checkbox], .wy-table .wy-table-cell-min input[type=checkbox], .rst-content table.docutils .wy-table-cell-min input[type=checkbox], .rst-content table.field-list .wy-table-cell-min input[type=checkbox] {
- margin: 0; }
-
-.wy-table-secondary {
- color: gray;
- font-size: 90%; }
-
-.wy-table-tertiary {
- color: gray;
- font-size: 80%; }
-
-.wy-table-odd td, .wy-table-striped tr:nth-child(2n-1) td, .rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td {
- background-color: #f3f6f6; }
-
-.wy-table-backed {
- background-color: #f3f6f6; }
-
-/* BORDERED TABLES */
-.wy-table-bordered-all, .rst-content table.docutils {
- border: 1px solid #e1e4e5; }
-
-.wy-table-bordered-all td, .rst-content table.docutils td {
- border-bottom: 1px solid #e1e4e5;
- border-left: 1px solid #e1e4e5; }
-
-.wy-table-bordered-all tbody > tr:last-child td, .rst-content table.docutils tbody > tr:last-child td {
- border-bottom-width: 0; }
-
-.wy-table-bordered {
- border: 1px solid #e1e4e5; }
-
-.wy-table-bordered-rows td {
- border-bottom: 1px solid #e1e4e5; }
-
-.wy-table-bordered-rows tbody > tr:last-child td {
- border-bottom-width: 0; }
-
-.wy-table-horizontal tbody > tr:last-child td {
- border-bottom-width: 0; }
-
-.wy-table-horizontal td, .wy-table-horizontal th {
- border-width: 0 0 1px 0;
- border-bottom: 1px solid #e1e4e5; }
-
-.wy-table-horizontal tbody > tr:last-child td {
- border-bottom-width: 0; }
-
-/* RESPONSIVE TABLES */
-.wy-table-responsive {
- margin-bottom: 24px;
- max-width: 100%;
- overflow: auto; }
-
-.wy-table-responsive table {
- margin-bottom: 0 !important; }
-
-.wy-table-responsive table td, .wy-table-responsive table th {
- white-space: nowrap; }
-
-a {
- color: #0099ee;
- text-decoration: none;
- cursor: pointer; }
-
-a:hover {
- color: #057eb6; }
-
-a:visited {
- color: #007ba8; }
-
-html {
- height: 100%;
- overflow-x: hidden; }
-
-body {
- font-family: "Nunito", Arial, sans-serif;
- font-weight: normal;
- color: #404040;
- min-height: 100%;
- overflow-x: hidden;
- background: #edf0f2; }
-
-.wy-text-left {
- text-align: left; }
-
-.wy-text-center {
- text-align: center; }
-
-.wy-text-right {
- text-align: right; }
-
-.wy-text-large {
- font-size: 120%; }
-
-.wy-text-normal {
- font-size: 100%; }
-
-.wy-text-small, small {
- font-size: 80%; }
-
-.wy-text-strike {
- text-decoration: line-through; }
-
-.wy-text-warning {
- color: #E67E22 !important; }
-
-a.wy-text-warning:hover {
- color: #eb9950 !important; }
-
-.wy-text-info {
- color: #2980B9 !important; }
-
-a.wy-text-info:hover {
- color: #409ad5 !important; }
-
-.wy-text-success {
- color: #27AE60 !important; }
-
-a.wy-text-success:hover {
- color: #36d278 !important; }
-
-.wy-text-danger {
- color: #E74C3C !important; }
-
-a.wy-text-danger:hover {
- color: #ed7669 !important; }
-
-.wy-text-neutral {
- color: #404040 !important; }
-
-a.wy-text-neutral:hover {
- color: #595959 !important; }
-
-h1, h2, .rst-content .toctree-wrapper p.caption, h3, h4, h5, h6, legend {
- margin-top: 0;
- font-weight: 700;
- font-family: "Nunito", Arial, sans-serif; }
-
-p {
- line-height: 24px;
- margin: 0;
- font-size: 16px;
- margin-bottom: 24px; }
-
-h1 {
- font-size: 36px; }
-
-h2, .rst-content .toctree-wrapper p.caption {
- font-size: 28px; }
-
-h3 {
- font-size: 20px; }
-
-h4 {
- font-size: 16px; }
-
-h5 {
- font-size: 16px; }
-
-h6 {
- font-size: 16px; }
-
-hr {
- display: block;
- height: 1px;
- border: 0;
- border-top: 1px solid #e1e4e5;
- margin: 24px 0;
- padding: 0; }
-
-code, .rst-content tt, .rst-content code {
- white-space: nowrap;
- max-width: 100%;
- background: #fff;
- border: solid 1px #e1e4e5;
- font-size: 90%;
- padding: 0 5px;
- font-family: "Operator mono", "Hack", "Menlo", Consolas, "Andale Mono WT", "Andale Mono", "Lucida Console", "Lucida Sans Typewriter", "DejaVu Sans Mono", "Bitstream Vera Sans Mono", "Liberation Mono", "Nimbus Mono L", Monaco, "Courier New", Courier, monospace;
- color: #E74C3C;
- overflow-x: auto; }
-
-code.code-large, .rst-content tt.code-large {
- font-size: 90%; }
-
-.wy-plain-list-disc, .rst-content .section ul, .rst-content .toctree-wrapper ul, article ul {
- list-style: disc;
- line-height: 24px;
- margin-bottom: 24px; }
-
-.wy-plain-list-disc li, .rst-content .section ul li, .rst-content .toctree-wrapper ul li, article ul li {
- list-style: disc;
- margin-left: 24px; }
-
-.wy-plain-list-disc li p:last-child, .rst-content .section ul li p:last-child, .rst-content .toctree-wrapper ul li p:last-child, article ul li p:last-child {
- margin-bottom: 0; }
-
-.wy-plain-list-disc li ul, .rst-content .section ul li ul, .rst-content .toctree-wrapper ul li ul, article ul li ul {
- margin-bottom: 0; }
-
-.wy-plain-list-disc li li, .rst-content .section ul li li, .rst-content .toctree-wrapper ul li li, article ul li li {
- list-style: circle; }
-
-.wy-plain-list-disc li li li, .rst-content .section ul li li li, .rst-content .toctree-wrapper ul li li li, article ul li li li {
- list-style: square; }
-
-.wy-plain-list-disc li ol li, .rst-content .section ul li ol li, .rst-content .toctree-wrapper ul li ol li, article ul li ol li {
- list-style: decimal; }
-
-.wy-plain-list-decimal, .rst-content .section ol, .rst-content ol.arabic, article ol {
- list-style: decimal;
- line-height: 24px;
- margin-bottom: 24px; }
-
-.wy-plain-list-decimal li, .rst-content .section ol li, .rst-content ol.arabic li, article ol li {
- list-style: decimal;
- margin-left: 24px; }
-
-.wy-plain-list-decimal li p:last-child, .rst-content .section ol li p:last-child, .rst-content ol.arabic li p:last-child, article ol li p:last-child {
- margin-bottom: 0; }
-
-.wy-plain-list-decimal li ul, .rst-content .section ol li ul, .rst-content ol.arabic li ul, article ol li ul {
- margin-bottom: 0; }
-
-.wy-plain-list-decimal li ul li, .rst-content .section ol li ul li, .rst-content ol.arabic li ul li, article ol li ul li {
- list-style: disc; }
-
-.codeblock-example {
- border: 1px solid #e1e4e5;
- border-bottom: none;
- padding: 24px;
- padding-top: 48px;
- font-weight: 500;
- background: #fff;
- position: relative; }
-
-.codeblock-example:after {
- content: "Example";
- position: absolute;
- top: 0px;
- left: 0px;
- background: #9B59B6;
- color: white;
- padding: 6px 12px; }
-
-.codeblock-example.prettyprint-example-only {
- border: 1px solid #e1e4e5;
- margin-bottom: 24px; }
-
-.codeblock, pre.literal-block, .rst-content .literal-block, .rst-content pre.literal-block, div[class^='highlight'] {
- padding: 0px;
- overflow-x: auto;
- background: #fff;
- margin: 1px 0 24px 0; }
-
-.codeblock div[class^='highlight'], pre.literal-block div[class^='highlight'], .rst-content .literal-block div[class^='highlight'], div[class^='highlight'] div[class^='highlight'] {
- border: none;
- background: #F5F7F9;
- margin: 0; }
-
-div[class^='highlight'] td.code {
- width: 100%; }
-
-.linenodiv pre {
- border-right: solid 1px #e6e9ea;
- margin: 0;
- padding: 12px 12px;
- font-family: "Operator mono", "Hack", "Menlo", Consolas, "Andale Mono WT", "Andale Mono", "Lucida Console", "Lucida Sans Typewriter", "DejaVu Sans Mono", "Bitstream Vera Sans Mono", "Liberation Mono", "Nimbus Mono L", Monaco, "Courier New", Courier, monospace;
- font-size: 14px;
- line-height: 1.5;
- color: #d9d9d9; }
-
-div[class^='highlight'] pre {
- white-space: pre;
- margin: 0;
- padding: 12px 12px;
- font-family: "Operator mono", "Hack", "Menlo", Consolas, "Andale Mono WT", "Andale Mono", "Lucida Console", "Lucida Sans Typewriter", "DejaVu Sans Mono", "Bitstream Vera Sans Mono", "Liberation Mono", "Nimbus Mono L", Monaco, "Courier New", Courier, monospace;
- font-size: 14px;
- line-height: 1.5;
- display: block;
- overflow: auto;
- color: #404040; }
-
-@media print {
- .codeblock, pre.literal-block, .rst-content .literal-block, .rst-content pre.literal-block, div[class^='highlight'], div[class^='highlight'] pre {
- white-space: pre-wrap; } }
-
-.hll {
- background-color: #ffffcc;
- margin: 0 -12px;
- padding: 0 12px;
- display: block; }
-
-.c {
- color: #999988;
- font-style: italic; }
-
-.err {
- color: #a61717;
- background-color: #e3d2d2; }
-
-.k {
- font-weight: bold; }
-
-.o {
- font-weight: bold; }
-
-.cm {
- color: #999988;
- font-style: italic; }
-
-.cp {
- color: #999999;
- font-weight: bold; }
-
-.c1 {
- color: #999988;
- font-style: italic; }
-
-.cs {
- color: #999999;
- font-weight: bold;
- font-style: italic; }
-
-.gd {
- color: #000000;
- background-color: #ffdddd; }
-
-.gd .x {
- color: #000000;
- background-color: #ffaaaa; }
-
-.ge {
- font-style: italic; }
-
-.gr {
- color: #aa0000; }
-
-.gh {
- color: #999999; }
-
-.gi {
- color: #000000;
- background-color: #ddffdd; }
-
-.gi .x {
- color: #000000;
- background-color: #aaffaa; }
-
-.go {
- color: #888888; }
-
-.gp {
- color: #555555; }
-
-.gs {
- font-weight: bold; }
-
-.gu {
- color: #800080;
- font-weight: bold; }
-
-.gt {
- color: #aa0000; }
-
-.kc {
- font-weight: bold; }
-
-.kd {
- font-weight: bold; }
-
-.kn {
- font-weight: bold; }
-
-.kp {
- font-weight: bold; }
-
-.kr {
- font-weight: bold; }
-
-.kt {
- color: #445588;
- font-weight: bold; }
-
-.m {
- color: #009999; }
-
-.s {
- color: #dd1144; }
-
-.n {
- color: #333333; }
-
-.na {
- color: teal; }
-
-.nb {
- color: #0086b3; }
-
-.nc {
- color: #445588;
- font-weight: bold; }
-
-.no {
- color: teal; }
-
-.ni {
- color: purple; }
-
-.ne {
- color: #990000;
- font-weight: bold; }
-
-.nf {
- color: #990000;
- font-weight: bold; }
-
-.nn {
- color: #555555; }
-
-.nt {
- color: navy; }
-
-.nv {
- color: teal; }
-
-.ow {
- font-weight: bold; }
-
-.w {
- color: #bbbbbb; }
-
-.mf {
- color: #009999; }
-
-.mh {
- color: #009999; }
-
-.mi {
- color: #009999; }
-
-.mo {
- color: #009999; }
-
-.sb {
- color: #dd1144; }
-
-.sc {
- color: #dd1144; }
-
-.sd {
- color: #dd1144; }
-
-.s2 {
- color: #dd1144; }
-
-.se {
- color: #dd1144; }
-
-.sh {
- color: #dd1144; }
-
-.si {
- color: #dd1144; }
-
-.sx {
- color: #dd1144; }
-
-.sr {
- color: #009926; }
-
-.s1 {
- color: #dd1144; }
-
-.ss {
- color: #990073; }
-
-.bp {
- color: #999999; }
-
-.vc {
- color: teal; }
-
-.vg {
- color: teal; }
-
-.vi {
- color: teal; }
-
-.il {
- color: #009999; }
-
-.gc {
- color: #999;
- background-color: #EAF2F5; }
-
-.wy-breadcrumbs li {
- display: inline-block; }
-
-.wy-breadcrumbs li.wy-breadcrumbs-aside {
- float: right; }
-
-.wy-breadcrumbs li a {
- display: inline-block;
- padding: 5px; }
-
-.wy-breadcrumbs li a:first-child {
- padding-left: 0; }
-
-.wy-breadcrumbs li code, .wy-breadcrumbs li .rst-content tt, .rst-content .wy-breadcrumbs li tt {
- padding: 5px;
- border: none;
- background: none; }
-
-.wy-breadcrumbs li code.literal, .wy-breadcrumbs li .rst-content tt.literal, .rst-content .wy-breadcrumbs li tt.literal {
- color: #404040; }
-
-.wy-breadcrumbs-extra {
- margin-bottom: 0;
- color: #b3b3b3;
- font-size: 80%;
- display: inline-block; }
-
-@media screen and (max-width: 480px) {
- .wy-breadcrumbs-extra {
- display: none; }
- .wy-breadcrumbs li.wy-breadcrumbs-aside {
- display: none; } }
-
-@media print {
- .wy-breadcrumbs li.wy-breadcrumbs-aside {
- display: none; } }
-
-.wy-affix {
- position: fixed;
- top: 0; }
-
-.wy-menu a:hover {
- text-decoration: none; }
-
-.wy-menu-horiz {
- *zoom: 1; }
-
-.wy-menu-horiz:before, .wy-menu-horiz:after {
- display: table;
- content: ""; }
-
-.wy-menu-horiz:after {
- clear: both; }
-
-.wy-menu-horiz ul, .wy-menu-horiz li {
- display: inline-block; }
-
-.wy-menu-horiz li:hover {
- background: rgba(255, 255, 255, 0.1); }
-
-.wy-menu-horiz li.divide-left {
- border-left: solid 1px #404040; }
-
-.wy-menu-horiz li.divide-right {
- border-right: solid 1px #404040; }
-
-.wy-menu-horiz a {
- height: 32px;
- display: inline-block;
- line-height: 32px;
- padding: 0 16px; }
-
-.wy-menu-vertical {
- width: 300px; }
-
-.wy-menu-vertical header, .wy-menu-vertical p.caption {
- height: 32px;
- display: inline-block;
- line-height: 32px;
- padding: 0 36px;
- margin-bottom: 0;
- display: block;
- font-weight: 600;
- text-transform: uppercase;
- font-size: 80%;
- color: #6f6f6f;
- white-space: nowrap; }
-
-.wy-menu-vertical ul {
- margin-bottom: 0; }
-
-.wy-menu-vertical li.current a {
- color: gray;
- padding: 0.4045em 20px 0.4045em 50px; }
-
-.wy-menu-vertical li code, .wy-menu-vertical li .rst-content tt, .rst-content .wy-menu-vertical li tt {
- border: none;
- background: inherit;
- color: inherit;
- padding-left: 0;
- padding-right: 0; }
-
-.wy-menu-vertical li span.toctree-expand {
- display: block;
- float: left;
- margin-left: -1.2em;
- font-size: 0.8em;
- line-height: 1.6em;
- color: #4d4d4d; }
-
-.wy-menu-vertical li.on a, .wy-menu-vertical li.current > a {
- color: #404040;
- padding: 0.4045em 36px;
- font-weight: 600;
- position: relative;
- border: none; }
-
-.wy-menu-vertical li.on a:hover span.toctree-expand, .wy-menu-vertical li.current > a:hover span.toctree-expand {
- color: gray; }
-
-.wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand {
- display: block;
- font-size: 0.8em;
- line-height: 1.6em;
- color: #333333; }
-
-.wy-menu-vertical li.toctree-l1.current li.toctree-l2 > ul, .wy-menu-vertical li.toctree-l2.current li.toctree-l3 > ul {
- display: none; }
-
-.wy-menu-vertical li.toctree-l1.current li.toctree-l2.current > ul, .wy-menu-vertical li.toctree-l2.current li.toctree-l3.current > ul {
- display: block; }
-
-.toctree-l2 {
- font-size: 14px; }
-
-.wy-menu-vertical li.toctree-l2.current > a {
- padding: 0.4045em 20px 0.4045em 52px;
- font-size: 14px; }
-
-.wy-menu-vertical li.toctree-l2.current li.toctree-l3 > a {
- display: block;
- padding: 0.4045em 20px 0.4045em 72px; }
-
-.wy-menu-vertical li.toctree-l2 a:hover span.toctree-expand {
- color: gray; }
-
-.wy-menu-vertical li.toctree-l2 span.toctree-expand {
- color: #a3a3a3; }
-
-.wy-menu-vertical .current > a > span.toctree-expand:before {
- margin-left: -2px; }
-
-.toctree-expand:before {
- margin-top: 1px; }
-
-.wy-menu-vertical li.toctree-l3 {
- font-size: 14px; }
-
-.wy-menu-vertical li.toctree-l3.current > a {
- padding: 0.4045em 20px 0.4045em 72px; }
-
-.wy-menu-vertical li.toctree-l3.current li.toctree-l4 > a {
- display: block;
- padding: 0.4045em 20px 0.4045em 92px;
- border-top: none;
- border-bottom: none; }
-
-.wy-menu-vertical li.toctree-l3 a:hover span.toctree-expand {
- color: gray; }
-
-.wy-menu-vertical li.toctree-l3 span.toctree-expand {
- color: #969696; }
-
-.wy-menu-vertical li.toctree-l4 {
- font-size: 14px; }
-
-.wy-menu-vertical li.current ul {
- display: block; }
-
-.wy-menu-vertical li ul {
- margin-bottom: 0;
- display: none; }
-
-.wy-menu-vertical .local-toc li ul {
- display: block; }
-
-.wy-menu-vertical li ul li a {
- margin-bottom: 0;
- color: #b3b3b3;
- font-weight: normal; }
-
-.wy-menu-vertical a {
- display: inline-block;
- padding: 0.4045em 36px;
- display: block;
- position: relative;
- color: #b3b3b3; }
-
-.wy-menu-vertical a:hover {
- cursor: pointer; }
-
-.wy-menu-vertical a:hover span.toctree-expand {
- color: #b3b3b3; }
-
-.wy-menu-vertical a:active {
- cursor: pointer; }
-
-.wy-side-nav-search {
- display: block;
- width: 100%;
- z-index: 200;
- text-align: center;
- display: block;
- color: #fcfcfc; }
-
-.wy-side-nav-search input[type=text] {
- width: 100%;
- border-radius: 5px;
- padding: 6px 33px 6px 12px;
- border: none;
- height: auto;
- font-size: 14px;
- box-shadow: none; }
-
-.wy-side-nav-search img {
- display: block;
- margin: auto auto 0.809em auto;
- height: 45px;
- width: 45px;
- background-color: #2980B9;
- padding: 5px;
- border-radius: 100%; }
-
-.wy-side-nav-search > a, .wy-side-nav-search .wy-dropdown > a {
- color: black;
- width: 75%;
- font-size: 100%;
- font-weight: 600;
- display: inline-block;
- padding: 4px 6px;
- margin-bottom: 0.809em; }
-
-.wy-side-nav-search > a:hover, .wy-side-nav-search .wy-dropdown > a:hover {
- background: rgba(255, 255, 255, 0.1); }
-
-.wy-side-nav-search > a img.logo, .wy-side-nav-search .wy-dropdown > a img.logo {
- display: block;
- margin: 0;
- height: 100%;
- width: 100%;
- border-radius: 0;
- background: transparent; }
-
-.wy-side-nav-search > a.icon img.logo, .wy-side-nav-search .wy-dropdown > a.icon img.logo {
- margin: 0; }
-
-.wy-side-nav-search > div.version {
- margin-top: -0.4045em;
- margin-bottom: 0.809em;
- font-weight: normal;
- color: rgba(255, 255, 255, 0.3); }
-
-.wy-nav .wy-menu-vertical header {
- color: #2980B9; }
-
-.wy-nav .wy-menu-vertical a {
- color: #b3b3b3; }
-
-.wy-nav .wy-menu-vertical a:hover {
- background-color: #2980B9;
- color: #fff; }
-
-[data-menu-wrap] {
- -webkit-transition: all 0.2s ease-in;
- -moz-transition: all 0.2s ease-in;
- transition: all 0.2s ease-in;
- position: absolute;
- opacity: 1;
- width: 100%;
- opacity: 0; }
-
-[data-menu-wrap].move-center {
- left: 0;
- right: auto;
- opacity: 1; }
-
-[data-menu-wrap].move-left {
- right: auto;
- left: -100%;
- opacity: 0; }
-
-[data-menu-wrap].move-right {
- right: -100%;
- left: auto;
- opacity: 0; }
-
-.wy-grid-for-nav {
- position: absolute;
- width: 100%;
- height: 100%; }
-
-.wy-nav-side {
- position: fixed;
- bottom: 0;
- left: 0;
- top: 0;
- padding-bottom: 60px;
- width: 300px;
- overflow-x: hidden;
- overflow-y: hidden;
- min-height: 100%;
- background: #343131;
- z-index: 200; }
- @media (min-width: 768px) {
- .wy-nav-side {
- top: 0px;
- min-height: calc(100% - 60px); } }
-
-.wy-side-scroll {
- width: 320px;
- position: relative;
- overflow-x: hidden;
- overflow-y: scroll;
- padding-bottom: 10px;
- height: 100%; }
-
-.wy-nav-top {
- display: none;
- background: #2980B9;
- color: #fff;
- padding: 0.4045em 0.809em;
- position: relative;
- line-height: 50px;
- text-align: center;
- font-size: 100%;
- *zoom: 1; }
-
-.wy-nav-top:before, .wy-nav-top:after {
- display: table;
- content: ""; }
-
-.wy-nav-top:after {
- clear: both; }
-
-.wy-nav-top a {
- color: #fff;
- font-weight: 600; }
-
-.wy-nav-top img {
- margin-right: 12px;
- height: 45px;
- width: 45px;
- background-color: #2980B9;
- padding: 5px;
- border-radius: 100%; }
-
-.wy-nav-top i {
- font-size: 30px;
- float: left;
- cursor: pointer;
- padding-top: inherit; }
-
-.wy-nav-content-wrap {
- margin-left: 300px;
- min-height: 100%; }
-
-.wy-nav-content {
- padding: 24px 46px 18px;
- height: 100%;
- max-width: 850px; }
-
-.wy-body-mask {
- position: fixed;
- width: 100%;
- height: 100%;
- background: rgba(0, 0, 0, 0.2);
- display: none;
- z-index: 499; }
-
-.wy-body-mask.on {
- display: block; }
-
-footer {
- color: gray; }
-
-footer p {
- margin-bottom: 12px; }
-
-footer span.commit code, footer span.commit .rst-content tt, .rst-content footer span.commit tt {
- padding: 0px;
- font-family: "Operator mono", "Hack", "Menlo", Consolas, "Andale Mono WT", "Andale Mono", "Lucida Console", "Lucida Sans Typewriter", "DejaVu Sans Mono", "Bitstream Vera Sans Mono", "Liberation Mono", "Nimbus Mono L", Monaco, "Courier New", Courier, monospace;
- font-size: 1em;
- background: none;
- border: none;
- color: gray; }
-
-.rst-footer-buttons {
- *zoom: 1; }
-
-.rst-footer-buttons:before, .rst-footer-buttons:after {
- width: 100%; }
-
-.rst-footer-buttons:before, .rst-footer-buttons:after {
- display: table;
- content: ""; }
-
-.rst-footer-buttons:after {
- clear: both; }
-
-.rst-breadcrumbs-buttons {
- margin-top: 12px;
- *zoom: 1; }
-
-.rst-breadcrumbs-buttons:before, .rst-breadcrumbs-buttons:after {
- display: table;
- content: ""; }
-
-.rst-breadcrumbs-buttons:after {
- clear: both; }
-
-#search-results .search li {
- margin-bottom: 24px;
- border-bottom: solid 1px #e1e4e5;
- padding-bottom: 24px; }
-
-#search-results .search li:first-child {
- border-top: solid 1px #e1e4e5;
- padding-top: 24px; }
-
-#search-results .search li a {
- font-size: 120%;
- margin-bottom: 12px;
- display: inline-block; }
-
-#search-results .context {
- color: gray;
- font-size: 90%; }
-
-@media screen and (max-width: 768px) {
- .wy-body-for-nav {
- background: #fcfcfc; }
- .wy-nav-top {
- display: block; }
- .wy-nav-side {
- left: -300px; }
- .wy-nav-side.shift {
- width: 85%;
- left: 0; }
- .wy-side-scroll {
- width: auto; }
- .wy-side-nav-search {
- width: auto; }
- .wy-menu.wy-menu-vertical {
- width: auto; }
- .wy-nav-content-wrap {
- margin-left: 0; }
- .wy-nav-content-wrap .wy-nav-content {
- padding: 26px 20px; }
- .wy-nav-content-wrap.shift {
- position: fixed;
- min-width: 100%;
- left: 85%;
- height: 100%;
- overflow: hidden;
- top: 0; } }
- @media screen and (max-width: 768px) and (min-width: 768px) {
- .wy-nav-content-wrap.shift {
- top: 60px; } }
-
-@media screen and (min-width: 1400px) {
- .wy-nav-content {
- margin: 0; } }
-
-@media print {
- .rst-versions, footer, .wy-nav-side {
- display: none; }
- .wy-nav-content-wrap {
- margin-left: 0; } }
-
-.rst-versions {
- position: fixed;
- bottom: 0;
- left: 0;
- width: 300px;
- color: black;
- background: white;
- z-index: 400;
- border-top-left-radius: 5px;
- border-top-right-radius: 5px;
- box-shadow: rgba(0, 0, 0, 0.25) 0px 2px 4px; }
-
-.rst-versions dt {
- color: black;
- font-weight: 600; }
-
-.rst-versions a {
- color: #2980B9;
- text-decoration: none; }
-
-.rst-versions .rst-badge-small {
- display: none; }
-
-.rst-versions .rst-current-version {
- padding: 20px;
- display: flex;
- text-align: right;
- font-size: 90%;
- cursor: pointer;
- color: black;
- *zoom: 1;
- justify-content: space-between;
- z-index: 999;
- position: relative; }
- .rst-versions .rst-current-version .fa {
- font-size: 12px; }
-
-.rst-versions .rst-current-version .fa, .rst-versions .rst-current-version .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li .rst-versions .rst-current-version span.toctree-expand, .rst-versions .rst-current-version .rst-content .admonition-title, .rst-content .rst-versions .rst-current-version .admonition-title, .rst-versions .rst-current-version .rst-content h1 .headerlink, .rst-content h1 .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content h2 .headerlink, .rst-content h2 .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content h3 .headerlink, .rst-content h3 .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content h4 .headerlink, .rst-content h4 .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content h5 .headerlink, .rst-content h5 .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content h6 .headerlink, .rst-content h6 .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content dl dt .headerlink, .rst-content dl dt .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content p.caption .headerlink, .rst-content p.caption .rst-versions .rst-current-version .headerlink, .rst-versions .rst-current-version .rst-content tt.download span:first-child, .rst-content tt.download .rst-versions .rst-current-version span:first-child, .rst-versions .rst-current-version .rst-content code.download span:first-child, .rst-content code.download .rst-versions .rst-current-version span:first-child, .rst-versions .rst-current-version .icon {
- color: black; }
-
-.rst-versions .rst-current-version.rst-out-of-date {
- background-color: #E74C3C;
- color: #fff; }
-
-.rst-versions .rst-current-version.rst-active-old-version {
- background-color: #F1C40F;
- color: #000; }
-
-.rst-versions.shift-up .rst-other-versions {
- display: block; }
-
-.rst-versions .rst-other-versions {
- font-size: 14px;
- color: gray;
- display: none;
- font: 0/0 a;
- padding: 0 !important; }
- .rst-versions .rst-other-versions .rst-other-versions {
- padding: 20px !important;
- margin-bottom: 40px; }
- .rst-versions .rst-other-versions .rst-current-version {
- display: none !important; }
- .rst-versions .rst-other-versions dl {
- font-family: "Nunito", sans-serif;
- font-size: 14px;
- line-height: 20px; }
- .rst-versions .rst-other-versions hr {
- display: none; }
-
-.rst-versions .rst-other-versions dd {
- display: inline-block;
- margin: 0; }
-
-.rst-versions .rst-other-versions dt + dd {
- margin-left: -6px; }
-
-.rst-versions .rst-other-versions dt + strong {
- margin-left: -6px; }
-
-.rst-versions .rst-other-versions dd a {
- display: inline-block;
- padding: 6px;
- color: black; }
-
-@media screen and (max-width: 768px) {
- .rst-versions {
- width: 85%;
- display: none; }
- .rst-versions.shift {
- display: block; } }
-
-.rst-content img {
- max-width: 100%;
- height: auto !important; }
-
-.rst-content div.figure {
- margin-bottom: 24px; }
-
-.rst-content div.figure p.caption {
- font-style: italic; }
-
-.rst-content div.figure.align-center {
- text-align: center; }
-
-.rst-content .section > img, .rst-content .section > a > img {
- margin-bottom: 24px; }
-
-.rst-content blockquote {
- padding-left: 20px;
- line-height: 24px;
- margin-bottom: 24px;
- border-left: 4px solid #E1E4E5; }
-
-.rst-content .note .last, .rst-content .attention .last, .rst-content .caution .last, .rst-content .danger .last, .rst-content .error .last, .rst-content .hint .last, .rst-content .important .last, .rst-content .tip .last, .rst-content .warning .last, .rst-content .seealso .last, .rst-content .admonition-todo .last, .rst-content .admonition .last {
- margin-bottom: 0; }
-
-.rst-content .admonition-title:before {
- margin-right: 4px; }
-
-.rst-content .admonition table {
- border-color: rgba(0, 0, 0, 0.1); }
-
-.rst-content .admonition table td, .rst-content .admonition table th {
- background: transparent !important;
- border-color: rgba(0, 0, 0, 0.1) !important; }
-
-.rst-content .section ol.loweralpha, .rst-content .section ol.loweralpha li {
- list-style: lower-alpha; }
-
-.rst-content .section ol.upperalpha, .rst-content .section ol.upperalpha li {
- list-style: upper-alpha; }
-
-.rst-content .section ol p, .rst-content .section ul p {
- margin-bottom: 12px; }
-
-.rst-content .line-block {
- margin-left: 24px; }
-
-.rst-content .topic-title {
- font-weight: 600;
- margin-bottom: 12px; }
-
-.rst-content .toc-backref {
- color: #404040; }
-
-.rst-content .align-right {
- float: right;
- margin: 0px 0px 24px 24px; }
-
-.rst-content .align-left {
- float: left;
- margin: 0px 24px 24px 0px; }
-
-.rst-content .align-center {
- margin: auto;
- display: block; }
-
-.rst-content h1 .headerlink, .rst-content h2 .headerlink, .rst-content .toctree-wrapper p.caption .headerlink, .rst-content h3 .headerlink, .rst-content h4 .headerlink, .rst-content h5 .headerlink, .rst-content h6 .headerlink, .rst-content dl dt .headerlink, .rst-content p.caption .headerlink {
- display: none;
- visibility: hidden;
- font-size: 14px; }
-
-.rst-content h1 .headerlink:after, .rst-content h2 .headerlink:after, .rst-content .toctree-wrapper p.caption .headerlink:after, .rst-content h3 .headerlink:after, .rst-content h4 .headerlink:after, .rst-content h5 .headerlink:after, .rst-content h6 .headerlink:after, .rst-content dl dt .headerlink:after, .rst-content p.caption .headerlink:after {
- visibility: visible;
- content: "";
- font-family: FontAwesome;
- display: inline-block; }
-
-.rst-content h1:hover .headerlink, .rst-content h2:hover .headerlink, .rst-content .toctree-wrapper p.caption:hover .headerlink, .rst-content h3:hover .headerlink, .rst-content h4:hover .headerlink, .rst-content h5:hover .headerlink, .rst-content h6:hover .headerlink, .rst-content dl dt:hover .headerlink, .rst-content p.caption:hover .headerlink {
- display: inline-block; }
-
-.rst-content .centered {
- text-align: center; }
-
-.rst-content .sidebar {
- float: right;
- width: 40%;
- display: block;
- margin: 0 0 24px 24px;
- padding: 24px;
- background: #f3f6f6;
- border: solid 1px #e1e4e5; }
-
-.rst-content .sidebar p, .rst-content .sidebar ul, .rst-content .sidebar dl {
- font-size: 90%; }
-
-.rst-content .sidebar .last {
- margin-bottom: 0; }
-
-.rst-content .sidebar .sidebar-title {
- display: block;
- font-family: "Nunito", Arial, sans-serif;
- font-weight: 600;
- background: #e1e4e5;
- padding: 6px 12px;
- margin: -24px;
- margin-bottom: 24px;
- font-size: 100%; }
-
-.rst-content .highlighted {
- background: #F1C40F;
- display: inline-block;
- font-weight: 600;
- padding: 0 6px; }
-
-.rst-content .footnote-reference, .rst-content .citation-reference {
- vertical-align: super;
- font-size: 90%; }
-
-.rst-content table.docutils.citation, .rst-content table.docutils.footnote {
- background: none;
- border: none;
- color: gray; }
-
-.rst-content table.docutils.citation td, .rst-content table.docutils.citation tr, .rst-content table.docutils.footnote td, .rst-content table.docutils.footnote tr {
- border: none;
- background-color: transparent !important;
- white-space: normal; }
-
-.rst-content table.docutils.citation td.label, .rst-content table.docutils.footnote td.label {
- padding-left: 0;
- padding-right: 0;
- vertical-align: top; }
-
-.rst-content table.docutils.citation tt, .rst-content table.docutils.citation code, .rst-content table.docutils.footnote tt, .rst-content table.docutils.footnote code {
- color: #555; }
-
-.rst-content table.field-list {
- border: none; }
-
-.rst-content table.field-list td {
- border: none; }
-
-.rst-content table.field-list td > strong {
- display: inline-block; }
-
-.rst-content table.field-list .field-name {
- padding-right: 10px;
- text-align: left;
- white-space: nowrap; }
-
-.rst-content table.field-list .field-body {
- text-align: left; }
-
-.rst-content tt, .rst-content tt, .rst-content code {
- color: #000;
- padding: 2px 5px; }
-
-.rst-content tt big, .rst-content tt em, .rst-content tt big, .rst-content code big, .rst-content tt em, .rst-content code em {
- font-size: 100% !important;
- line-height: normal; }
-
-.rst-content tt.literal, .rst-content tt.literal, .rst-content code.literal {
- color: #E74C3C; }
-
-.rst-content tt.xref, a .rst-content tt, .rst-content tt.xref, .rst-content code.xref, a .rst-content tt, a .rst-content code {
- font-weight: 600;
- color: #404040; }
-
-.rst-content a tt, .rst-content a tt, .rst-content a code {
- color: #2980B9; }
-
-.rst-content dl {
- margin-bottom: 24px; }
-
-.rst-content dl dt {
- font-weight: 600; }
-
-.rst-content dl p, .rst-content dl table, .rst-content dl ul, .rst-content dl ol {
- margin-bottom: 12px !important; }
-
-.rst-content dl dd {
- margin: 0 0 12px 24px; }
-
-.rst-content dl:not(.docutils) {
- margin-bottom: 24px; }
-
-.rst-content dl:not(.docutils) dt {
- display: table;
- margin: 6px 0;
- font-size: 90%;
- line-height: normal;
- background: #e7f2fa;
- color: #2980B9;
- border-top: solid 3px #6ab0de;
- padding: 6px;
- position: relative; }
-
-.rst-content dl:not(.docutils) dt:before {
- color: #6ab0de; }
-
-.rst-content dl:not(.docutils) dt .headerlink {
- color: #404040;
- font-size: 100% !important; }
-
-.rst-content dl:not(.docutils) dl dt {
- margin-bottom: 6px;
- border: none;
- border-left: solid 3px #cccccc;
- background: #f0f0f0;
- color: #555; }
-
-.rst-content dl:not(.docutils) dl dt .headerlink {
- color: #404040;
- font-size: 100% !important; }
-
-.rst-content dl:not(.docutils) dt:first-child {
- margin-top: 0; }
-
-.rst-content dl:not(.docutils) tt, .rst-content dl:not(.docutils) tt, .rst-content dl:not(.docutils) code {
- font-weight: 600; }
-
-.rst-content dl:not(.docutils) tt.descname, .rst-content dl:not(.docutils) tt.descclassname, .rst-content dl:not(.docutils) tt.descname, .rst-content dl:not(.docutils) code.descname, .rst-content dl:not(.docutils) tt.descclassname, .rst-content dl:not(.docutils) code.descclassname {
- background-color: transparent;
- border: none;
- padding: 0;
- font-size: 100% !important; }
-
-.rst-content dl:not(.docutils) tt.descname, .rst-content dl:not(.docutils) tt.descname, .rst-content dl:not(.docutils) code.descname {
- font-weight: 600; }
-
-.rst-content dl:not(.docutils) .optional {
- display: inline-block;
- padding: 0 4px;
- color: #000;
- font-weight: 600; }
-
-.rst-content dl:not(.docutils) .property {
- display: inline-block;
- padding-right: 8px; }
-
-.rst-content .viewcode-link, .rst-content .viewcode-back {
- display: inline-block;
- color: #27AE60;
- font-size: 80%;
- padding-left: 24px; }
-
-.rst-content .viewcode-back {
- display: block;
- float: right; }
-
-.rst-content p.rubric {
- margin-bottom: 12px;
- font-weight: 600; }
-
-.rst-content tt.download, .rst-content code.download {
- background: inherit;
- padding: inherit;
- font-weight: normal;
- font-family: inherit;
- font-size: inherit;
- color: inherit;
- border: inherit;
- white-space: inherit; }
-
-.rst-content tt.download span:first-child:before, .rst-content code.download span:first-child:before {
- margin-right: 4px; }
-
-.rst-content .guilabel {
- border: 1px solid #7fbbe3;
- background: #e7f2fa;
- font-size: 80%;
- font-weight: 700;
- border-radius: 4px;
- padding: 2.4px 6px;
- margin: auto 2px; }
-
-.rst-content .versionmodified {
- font-style: italic; }
-
-@media screen and (max-width: 480px) {
- .rst-content .sidebar {
- width: 100%; } }
-
-span[id*='MathJax-Span'] {
- color: #404040; }
-
-.math {
- text-align: center; }
-
-@media (min-width: 768px) {
- body {
- padding-top: 0px; } }
-
-.wy-side-nav-search {
- color: black; }
-
-.wy-side-nav-search > div.version {
- color: rgba(0, 0, 0, 0.3); }
-
-.wy-nav-side,
-.wy-side-nav-search {
- background-color: #f5f7f9; }
-
-.wy-nav-content-wrap {
- background-color: white; }
-
-.wy-menu-vertical li a,
-.wy-menu-vertical li ul li a,
-.wy-menu-vertical li.current a {
- color: black; }
-
-.header {
- display: flex;
- background-color: white;
- width: 100%;
- border-bottom: 1px solid #e1e4e5;
- z-index: 999;
- height: 60px;
- padding: 0 20px;
- flex-direction: row;
- align-items: center;
- transform: translateZ(0);
- left: 0;
- top: 0;
- position: absolute; }
- @media (min-width: 768px) {
- .header {
- position: fixed; } }
- .header .logo-link {
- display: block;
- margin-left: auto; }
- @media (max-width: 767px) {
- .header .logo-link {
- display: none; } }
- .header .fa-bars {
- cursor: pointer;
- color: black;
- display: none;
- font-size: 30px;
- margin-right: 20px; }
- @media (max-width: 767px) {
- .header .fa-bars {
- display: block; } }
-
-.header-title-wrap {
- width: 280px; }
-
-.header-title {
- font-size: 20px; }
- .header-title, .header-title:hover, .header-title:visited, .header-title:focus {
- color: black;
- text-decoration: none; }
-
-.logo {
- height: 16px; }
-
-.wy-breadcrumbs {
- font-size: 16px;
- font-weight: 600;
- min-height: 30px;
- margin-bottom: 20px; }
- .wy-breadcrumbs li a {
- padding: 0; }
-
-.wy-side-nav-search {
- height: auto;
- border-bottom: 1px solid #e1e4e5;
- padding-left: 20px;
- padding-right: 20px;
- display: block;
- align-items: flex-start;
- margin-bottom: 20px; }
- @media (max-width: 767px) {
- .wy-side-nav-search {
- height: 60px; } }
- .wy-side-nav-search .search-form,
- .wy-side-nav-search .search-form form {
- width: 100%;
- position: relative; }
-
-.search-btn {
- -webkit-appearance: none;
- -moz-appearance: none;
- appearance: none;
- background: transparent;
- border: none;
- width: auto;
- height: auto;
- padding: 0;
- margin: 0;
- position: absolute;
- right: 10px;
- top: 4px; }
-
-.rst-footer-buttons {
- margin-top: 50px; }
-
-footer > hr {
- margin-bottom: 14px; }
-
-.footer-info {
- color: #8392a4;
- display: flex;
- flex-direction: column; }
- @media (min-width: 768px) {
- .footer-info {
- flex-direction: row;
- justify-content: space-between; } }
- .footer-info p {
- font-size: 12px; }
- @media (min-width: 768px) {
- .footer-info p {
- margin-bottom: 0px; } }
- .footer-info a {
- color: #8392a4 !important;
- text-decoration: underline; }
-
-.divio-cloud {
- margin: 20px 20px 10px;
- border: 1px solid #e0e0e0;
- background: white;
- border-radius: 5px;
- padding: 20px 16px; }
- @media (max-width: 767px) {
- .divio-cloud {
- display: none; } }
-
-.divio-cloud-caption {
- font-size: 12px;
- text-transform: uppercase;
- line-height: 16px;
- margin-bottom: 16px;
- color: rgba(0, 0, 0, 0.5);
- display: block; }
-
-.divio-cloud-heading {
- font-size: 22px;
- font-weight: normal;
- line-height: 26px;
- padding-right: 10px;
- margin-bottom: 18px; }
-
-.divio-cloud-features {
- padding-left: 15px; }
-
-.divio-cloud-features li {
- margin-bottom: 8px;
- font-size: 14px;
- line-height: 18px;
- position: relative; }
-
-.divio-cloud-features li:before {
- content: "";
- width: 7px;
- height: 7px;
- background: #0bf;
- border-radius: 7px;
- display: block;
- position: absolute;
- left: -14px;
- top: 5px; }
-
-.divio-cloud-btn {
- display: block;
- text-align: center;
- height: 36px;
- line-height: 36px;
- color: white;
- background: #0bf;
- padding: 0 !important;
- border-radius: 5px;
- margin-top: 20px;
- font-size: 14px; }
- .divio-cloud-btn:visited {
- background-color: #0bf !important; }
-
-.wy-menu.wy-menu-vertical ~ div {
- display: none !important; }
-
-.rst-content h1 code,
-.rst-content h2 code,
-.rst-content h3 code,
-.rst-content h4 code,
-.rst-content h5 code,
-.rst-content h6 code {
- border: none;
- padding-left: 0px;
- padding-right: 0px; }
-
-.tabs__nav {
- display: block; }
-
-.tabs__link {
- display: inline-block;
- padding: 10px 15px;
- border-bottom: 2px solid transparent;
- color: black !important; }
- .tabs__link:hover, .tabs__link:visited, .tabs__link:active {
- color: black !important; }
-
-.tabs__link--active {
- color: #0bf !important;
- font-weight: bold;
- border-bottom: 2px solid #0bf !important; }
- .tabs__link--active:hover, .tabs__link--active:visited, .tabs__link--active:active {
- color: #0bf !important; }
-
-.tabs__content {
- padding-top: 20px; }
-
-.tabs-pane {
- display: none; }
-
-.rst-content .sidebar {
- background: #f5f7f9;
-}
-
-.rst-content .sidebar .sidebar-title {
- background: #057eb6;
- color: #fff;
-}
-
-.rst-content .sidebar .sidebar-subtitle {
- font-weight: bold;
-}
\ No newline at end of file
diff --git a/docs/_static/images/logo-transparent.png b/docs/_static/images/logo-transparent.png
index b377336841d5aff5ea43ac043b9010424b93af83..9400f543cee15409d308a4219c211b64b48adb9b 100644
GIT binary patch
literal 179037