diff --git a/docs/user/config-file/v1.rst b/docs/user/config-file/v1.rst deleted file mode 100644 index 740f785c0da..00000000000 --- a/docs/user/config-file/v1.rst +++ /dev/null @@ -1,245 +0,0 @@ -:orphan: - -Configuration file v1 (Deprecated) -================================== - -Read the Docs has support for configuring builds with a YAML file. -:doc:`The Read the Docs file ` must be in the root directory of your project. - -.. warning:: - - Version 1 is deprecated and :doc:`support will be removed in September 2023 `. - You should use version 2 of the configuration file. - See the :ref:`new features ` - and :ref:`how to migrate from v1 `. - -Here is an example of what this file looks like: - -.. code:: yaml - - # .readthedocs.yaml - - build: - image: latest - - python: - version: 3.6 - setup_py_install: true - - -Supported settings ------------------- - -.. warning:: - - When using a v1 configuration file, - the local settings from the web interface are overridden. - -version -~~~~~~~ - -* Default: 1 - -.. code-block:: yaml - - version: 1 - - -formats -~~~~~~~ - -* Default: [``htmlzip``, ``pdf``, ``epub``] -* Options: ``htmlzip``, ``pdf``, ``epub`` -* Type: List - -The formats of your documentation you want to be built. -Set as an empty list ``[]`` to build none of the formats. - -.. note:: We will always build an HTML & JSON version of your documentation. - These are used for web serving & search indexing, respectively. - -.. code-block:: yaml - - # Don't build any extra formats - formats: [] - -.. code-block:: yaml - - # Build PDF & ePub - formats: - - epub - - pdf - - -requirements_file -~~~~~~~~~~~~~~~~~ - -* Default: ``null`` -* Type: Path (specified from the root of the project) - -The path to your pip requirements file. - -.. code-block:: yaml - - requirements_file: requirements/docs.txt - - -conda -~~~~~ - -The ``conda`` block allows for configuring our support for Conda. 
- -conda.file -`````````` - -* Default: ``null`` -* Type: Path (specified from the root of the project) - -The file option specified the Conda `environment file`_ to use. - -.. code-block:: yaml - - conda: - file: environment.yml - -.. note:: Conda is only supported via the YAML file. - - -build -~~~~~ - -The ``build`` block configures specific aspects of the documentation build. - - -build.image -``````````` - -* Default: ``latest`` -* Options: ``stable``, ``latest`` - -The build image to use for specific builds. -This lets users specify a more experimental build image, -if they want to be on the cutting edge. - -Certain Python versions require a certain build image, -as defined here: - -* ``stable``: - ``2``, ``2.7``, ``3``, ``3.5``, ``3.6``, ``3.7`` -* ``latest``: - ``2``, ``2.7``, ``3``, ``3.5``, ``3.6``, ``3.7``, ``3.8`` - -.. code-block:: yaml - - build: - image: latest - - python: - version: 3.6 - - -python -~~~~~~ - -The ``python`` block allows you to configure aspects of the Python executable -used for building documentation. - - -python.version -`````````````` - -* Default: ``3.7`` -* Options: ``2``, ``2.7``, ``3``, ``3.5``, ``3.6``, ``3.7``, ``3.8`` - -This is the version of Python to use when building your documentation. -If you specify only the major version of Python, -the highest supported minor version will be selected. - -.. warning:: - - The supported Python versions depends on the version of the build image your - project is using. The default build image that is used to build - documentation contains support for Python ``2.7`` and ``3.7``. - See :ref:`config-file/v1:build.image` for more information on supported Python versions. - -.. code-block:: yaml - - python: - version: 3.5 - -python.setup_py_install -``````````````````````` - -* Default: ``false`` -* Type: Boolean - -When true, install your project into the Virtualenv with -``python setup.py install`` when building documentation. - -.. 
code-block:: yaml - - python: - setup_py_install: true - - -python.pip_install -`````````````````` - -* Default: ``false`` -* Type: Boolean - -When ``true``, install your project into the virtualenv with pip when building -documentation. - -.. code-block:: yaml - - python: - pip_install: true - -python.extra_requirements -````````````````````````` - -* Default: ``[]`` -* Type: List - -List of `extra requirements`_ sections to install, additionally to the -`package default dependencies`_. Only works if ``python.pip_install`` option -above is set to ``true``. - -Let's say your Python package has a ``setup.py`` which looks like this: - -.. code-block:: python - - from setuptools import setup - - setup( - name="my_package", - # (...) - install_requires=["requests", "simplejson"], - extras_require={ - "tests": ["nose", "pycodestyle >= 2.1.0"], - "docs": ["sphinx >= 1.4", "sphinx_rtd_theme"], - }, - ) - -Then to have all dependencies from the ``tests`` and ``docs`` sections -installed in addition to the default ``requests`` and ``simplejson``, use the -``extra_requirements`` as such: - -.. code-block:: yaml - - python: - extra_requirements: - - tests - - docs - -Behind the scene the following Pip command will be run: - -.. prompt:: bash $ - - pip install .[tests,docs] - - -.. _environment file: https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#creating-an-environment-from-an-environment-yml-file -.. _extra requirements: https://setuptools.readthedocs.io/en/latest/userguide/dependency_management.html#optional-dependencies -.. _package default dependencies: https://setuptools.readthedocs.io/en/latest/userguide/dependency_management.html#declaring-required-dependency diff --git a/docs/user/config-file/v2.rst b/docs/user/config-file/v2.rst index 2e40690b0e9..1dc47b479c5 100644 --- a/docs/user/config-file/v2.rst +++ b/docs/user/config-file/v2.rst @@ -62,10 +62,6 @@ Example: version: 2 -.. 
warning:: - - If you don't provide the version, :doc:`v1 ` will be used. - formats ~~~~~~~ @@ -131,16 +127,6 @@ Configuration of the Python environment to be used. - method: pip path: another/package -python.version -`````````````` - -.. warning:: - - This option is now deprecated - and replaced by :ref:`config-file/v2:build.tools.python`. - See :ref:`config-file/v2:python.version (legacy)` - for the description of this option. - python.install `````````````` @@ -772,154 +758,3 @@ You can see the complete schema This schema is available at `Schema Store`_, use it with your favorite editor for validation and autocompletion. .. _Schema Store: https://www.schemastore.org/ - -Legacy ``build`` specification ------------------------------- - -The legacy ``build`` specification used a different set of Docker images, -and only allowed you to specify the Python version. -It remains supported for backwards compatibility reasons. -Check out the :ref:`config-file/v2:build` above -for an alternative method that is more flexible. - -.. code-block:: yaml - - version: 2 - - build: - image: latest - apt_packages: - - libclang - - cmake - - python: - version: "3.7" - -The legacy ``build`` specification also supports -the ``apt_packages`` key described above. - -.. warning:: - - When using the new specification, - the ``build.image`` and ``python.version`` options cannot be used. - Doing so will error the build. - -build (legacy) -~~~~~~~~~~~~~~ - -build.image (legacy) -```````````````````` - -The Docker image used for building the docs. 
- -:Type: ``string`` -:Options: ``stable``, ``latest`` -:Default: ``latest`` - -Each image support different Python versions and has different packages installed, -as defined here: - -* `stable `_: - ``2``, ``2.7``, ``3``, ``3.5``, ``3.6``, ``3.7`` -* `latest `_: - ``2``, ``2.7``, ``3``, ``3.5``, ``3.6``, ``3.7``, ``3.8`` - -python.version (legacy) -``````````````````````` - -The Python version (this depends on :ref:`config-file/v2:build.image (legacy)`). - -:Type: ``string`` -:Default: ``3`` - -.. note:: - - Make sure to use quotes (``"``) to make it a string. - We previously supported using numbers here, - but that approach is deprecated. - -.. warning:: - - If you are using a :ref:`Conda ` environment to manage - the build, this setting will not have any effect, as the Python version is managed by Conda. - -Migrating from v1 ------------------ - -Changes -~~~~~~~ - -- The version setting is required. See :ref:`config-file/v2:version`. -- The default value of the :ref:`config-file/v2:formats` setting has changed to ``[]`` - and it doesn't include the values from the web interface. -- The top setting ``requirements_file`` was moved to ``python.install`` - and we don't try to find a requirements file if the option isn't present. - See :ref:`config-file/v2:Requirements file`. -- The setting ``conda.file`` was renamed to ``conda.environment``. - See :ref:`config-file/v2:conda.environment`. -- The ``build.image`` setting has been replaced by ``build.os``. - See :ref:`config-file/v2:build.os`. - Alternatively, you can use the legacy ``build.image`` - that now has only two options: ``latest`` (default) and ``stable``. -- The settings ``python.setup_py_install`` and ``python.pip_install`` were replaced by ``python.install``. - And now it accepts a path to the package. - See :ref:`config-file/v2:Packages`. -- The build will fail if there are invalid keys (strict mode). - -.. 
warning:: - - Some values from the web interface are no longer respected, - please see :ref:`config-file/v2:Migrating from the web interface` if you have settings there. - -New settings -~~~~~~~~~~~~ - -- :ref:`config-file/v2:sphinx` -- :ref:`config-file/v2:mkdocs` -- :ref:`config-file/v2:submodules` -- :ref:`config-file/v2:python.install` -- :ref:`config-file/v2:search` - -Migrating from the web interface --------------------------------- - -This should be pretty straightforward, -just go to the :guilabel:`Admin` > :guilabel:`Advanced settings`, -and find their respective setting in :ref:`here `. - -Not all settings in the web interface are per version, but are per project. -These settings aren't supported via the configuration file. - -* ``Name`` -* ``Repository URL`` -* ``Repository type`` -* ``Language`` -* ``Programming language`` -* ``Project homepage`` -* ``Tags`` -* ``Single version`` -* ``Default branch`` -* ``Default version`` -* ``Show versions warning`` -* ``Privacy level`` -* ``Analytics code`` - -Custom paths for .readthedocs.yaml ----------------------------------- - -In order to support *monorepo* layouts, -it's possible to configure the path to where your ``.readthedocs.yaml`` is found. - -Using this configuration makes it possible to create several Read the Docs projects pointing at the same Git repository. -This is recommended for monorepo layouts that host several documentation projects in the same repository. - -.. seealso:: - - :doc:`/guides/setup/monorepo` - This guide explains how to use the configuration. - -Previous version: v1 --------------------- - -Version 1 is deprecated and using it is discouraged, -view its reference here :doc:`/config-file/v1`. 
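Editorial note between the docs diffs and the code diffs: the migration section above maps each v1 key to its v2 replacement, and a single side-by-side sketch makes that mapping concrete. The `ubuntu-22.04` / `"3"` tool versions below are illustrative defaults borrowed from this PR's own test helper, not the only valid values, and a real project would normally use either pip *or* Conda, not both:

```yaml
# v1 (deprecated), shown as comments for contrast:
#   build:
#     image: latest
#   python:
#     version: 3.7
#     pip_install: true
#   requirements_file: requirements/docs.txt
#   conda:
#     file: environment.yml

# v2 equivalent (.readthedocs.yaml):
version: 2
build:
  os: ubuntu-22.04          # replaces build.image
  tools:
    python: "3"             # replaces python.version (note the quotes)
python:
  install:
    - requirements: requirements/docs.txt   # replaces requirements_file
    - method: pip                           # replaces python.pip_install
      path: .
conda:
  environment: environment.yml              # conda.file was renamed
```

Quoting the Python version matters: YAML 1.1 parses an unquoted `3.10` as the float `3.1`, which is exactly the special case the removed `validate_python` code below used to paper over.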
diff --git a/readthedocs/config/config.py b/readthedocs/config/config.py index d58af6ed349..5eafa0e3310 100644 --- a/readthedocs/config/config.py +++ b/readthedocs/config/config.py @@ -1,5 +1,3 @@ -# pylint: disable=too-many-lines - """Build configuration for rtd.""" import collections @@ -11,14 +9,12 @@ from django.conf import settings -from readthedocs.builds import constants_docker from readthedocs.config.utils import list_to_dict, to_dict from readthedocs.core.utils.filesystem import safe_open from readthedocs.projects.constants import GENERIC from .find import find_one from .models import ( - Build, BuildJobs, BuildTool, BuildWithOs, @@ -47,7 +43,6 @@ __all__ = ( "ALL", "load", - "BuildConfigV1", "BuildConfigV2", "ConfigError", "ConfigOptionNotSupportedError", @@ -103,9 +98,9 @@ class DefaultConfigFileNotFound(ConfigError): """Error when we can't find a configuration file.""" - def __init__(self, directory): + def __init__(self): super().__init__( - f"No default configuration file in: {directory}", + "No default configuration file found at repository's root.", CONFIG_FILE_REQUIRED, ) @@ -197,8 +192,6 @@ class BuildConfigBase: You need to call ``validate`` before the config is ready to use. - :param env_config: A dict that contains additional information - about the environment. :param raw_config: A dict with all configuration without validation. :param source_file: The file that contains the configuration. All paths are relative to this file. 
@@ -219,12 +212,9 @@ class BuildConfigBase: 'search', ] - default_build_image = settings.DOCKER_DEFAULT_VERSION - version = None - def __init__(self, env_config, raw_config, source_file, base_path=None): - self.env_config = env_config + def __init__(self, raw_config, source_file, base_path=None): self._raw_config = copy.deepcopy(raw_config) self.source_config = copy.deepcopy(raw_config) self.source_file = source_file @@ -236,7 +226,6 @@ def __init__(self, env_config, raw_config, source_file, base_path=None): self.base_path = self.source_file else: self.base_path = os.path.dirname(self.source_file) - self.defaults = self.env_config.get('defaults', {}) self._config = {} @@ -300,15 +289,9 @@ def pop_config(self, key, default=None, raise_ex=False): def validate(self): raise NotImplementedError() - @property - def using_build_tools(self): - return isinstance(self.build, BuildWithOs) - @property def is_using_conda(self): - if self.using_build_tools: - return self.python_interpreter in ("conda", "mamba") - return self.conda is not None + return self.python_interpreter in ("conda", "mamba") @property def is_using_setup_py_install(self): @@ -320,93 +303,18 @@ def is_using_setup_py_install(self): @property def python_interpreter(self): - if self.using_build_tools: - tool = self.build.tools.get('python') - if tool and tool.version.startswith('mamba'): - return 'mamba' - if tool and tool.version.startswith('miniconda'): - return 'conda' - if tool: - return 'python' - return None - version = self.python_full_version - return f'python{version}' + tool = self.build.tools.get("python") + if tool and tool.version.startswith("mamba"): + return "mamba" + if tool and tool.version.startswith("miniconda"): + return "conda" + if tool: + return "python" + return None @property def docker_image(self): - if self.using_build_tools: - return self.settings['os'][self.build.os] - return self.build.image - - @property - def python_full_version(self): - version = self.python.version - if version 
in ['2', '3']: - # use default Python version if user only set '2', or '3' - return self.get_default_python_version_for_image( - self.build.image, - version, - ) - return version - - @property - def valid_build_images(self): - """ - Return all the valid Docker image choices for ``build.image`` option. - - The user can use any of this values in the YAML file. These values are - the keys of ``DOCKER_IMAGE_SETTINGS`` Django setting (without the - ``readthedocs/build`` part) plus ``stable``, ``latest`` and ``testing``. - """ - images = {'stable', 'latest', 'testing'} - for k in settings.DOCKER_IMAGE_SETTINGS: - _, version = k.split(':') - if re.fullmatch(r'^[\d\.]+$', version): - images.add(version) - return images - - def get_valid_python_versions_for_image(self, build_image): - """ - Return all the valid Python versions for a Docker image. - - The Docker image (``build_image``) has to be its complete name, already - validated: ``readthedocs/build:4.0``, not just ``4.0``. - - Returns supported versions for the ``DOCKER_DEFAULT_VERSION`` if not - ``build_image`` found. - """ - if build_image not in settings.DOCKER_IMAGE_SETTINGS: - build_image = '{}:{}'.format( - constants_docker.DOCKER_DEFAULT_IMAGE, - self.default_build_image, - ) - return settings.DOCKER_IMAGE_SETTINGS[build_image]['python']['supported_versions'] - - def get_default_python_version_for_image(self, build_image, python_version): - """ - Return the default Python version for Docker image and Py2 or Py3. - - :param build_image: the Docker image complete name, already validated - (``readthedocs/build:4.0``, not just ``4.0``) - :type build_image: str - - :param python_version: major Python version (``2`` or ``3``) to get its - default full version - :type python_version: int - - :returns: default version for the ``DOCKER_DEFAULT_VERSION`` if not - ``build_image`` found. 
- """ - if build_image not in settings.DOCKER_IMAGE_SETTINGS: - build_image = '{}:{}'.format( - constants_docker.DOCKER_DEFAULT_IMAGE, - self.default_build_image, - ) - return ( - # For linting - settings.DOCKER_IMAGE_SETTINGS[build_image]['python'] - ['default_version'][python_version] - ) + return self.settings["os"][self.build.os] def as_dict(self): config = {} @@ -420,312 +328,6 @@ def __getattr__(self, name): raise ConfigOptionNotSupportedError(name) -class BuildConfigV1(BuildConfigBase): - - """Version 1 of the configuration file.""" - - PYTHON_INVALID_MESSAGE = '"python" section must be a mapping.' - PYTHON_EXTRA_REQUIREMENTS_INVALID_MESSAGE = ( - '"python.extra_requirements" section must be a list.' - ) - - version = '1' - - def get_valid_python_versions(self): - """ - Return all valid Python versions. - - .. note:: - - It does not take current build image used into account. - """ - try: - return self.env_config['python']['supported_versions'] - except (KeyError, TypeError): - versions = set() - for _, options in settings.DOCKER_IMAGE_SETTINGS.items(): - versions = versions.union( - options['python']['supported_versions'] - ) - return versions - - def get_valid_formats(self): # noqa - """Get all valid documentation formats.""" - return ( - 'htmlzip', - 'pdf', - 'epub', - ) - - def validate(self): - """ - Validate and process ``raw_config`` and ``env_config`` attributes. - - It makes sure that: - - - ``base`` is a valid directory and defaults to the directory of the - ``readthedocs.yml`` config file if not set - """ - # Validate env_config. - # Validate the build environment first - # Must happen before `validate_python`! - self._config['build'] = self.validate_build() - - # Validate raw_config. Order matters. 
- self._config['python'] = self.validate_python() - self._config['formats'] = self.validate_formats() - - self._config['conda'] = self.validate_conda() - self._config['requirements_file'] = self.validate_requirements_file() - - def validate_build(self): - """ - Validate the build config settings. - - This is a bit complex, - so here is the logic: - - * We take the default image & version if it's specific in the environment - * Then update the _version_ from the users config - * Then append the default _image_, since users can't change this - * Then update the env_config with the settings for that specific image - - This is currently used for a build image -> python version mapping - - This means we can use custom docker _images_, - but can't change the supported _versions_ that users have defined. - """ - # Defaults - if 'build' in self.env_config: - build = self.env_config['build'].copy() - else: - build = {'image': settings.DOCKER_IMAGE} - - # User specified - if 'build' in self._raw_config: - _build = self._raw_config['build'] - if 'image' in _build: - with self.catch_validation_error('build'): - build['image'] = validate_choice( - str(_build['image']), - self.valid_build_images, - ) - if ':' not in build['image']: - # Prepend proper image name to user's image name - build['image'] = '{}:{}'.format( - constants_docker.DOCKER_DEFAULT_IMAGE, - build['image'], - ) - # Update docker default settings from image name - if build['image'] in settings.DOCKER_IMAGE_SETTINGS: - self.env_config.update( - settings.DOCKER_IMAGE_SETTINGS[build['image']] - ) - - # Allow to override specific project - config_image = self.defaults.get('build_image') - if config_image: - build['image'] = config_image - return build - - def validate_python(self): - """Validates the ``python`` key, set default values it's necessary.""" - install_project = self.defaults.get('install_project', False) - version = self.defaults.get('python_version', '2') - python = { - 'install_with_pip': False, - 
'extra_requirements': [], - 'install_with_setup': install_project, - 'version': version, - } - - if 'python' in self._raw_config: - raw_python = self._raw_config['python'] - if not isinstance(raw_python, dict): - self.error( - 'python', - self.PYTHON_INVALID_MESSAGE, - code=PYTHON_INVALID, - ) - - # Validate pip_install. - if 'pip_install' in raw_python: - with self.catch_validation_error('python.pip_install'): - python['install_with_pip'] = validate_bool( - raw_python['pip_install'], - ) - - # Validate extra_requirements. - if 'extra_requirements' in raw_python: - raw_extra_requirements = raw_python['extra_requirements'] - if not isinstance(raw_extra_requirements, list): - self.error( - 'python.extra_requirements', - self.PYTHON_EXTRA_REQUIREMENTS_INVALID_MESSAGE, - code=PYTHON_INVALID, - ) - if not python['install_with_pip']: - python['extra_requirements'] = [] - else: - for extra_name in raw_extra_requirements: - with self.catch_validation_error('python.extra_requirements'): - python['extra_requirements'].append( - validate_string(extra_name), - ) - - # Validate setup_py_install. 
- if 'setup_py_install' in raw_python: - with self.catch_validation_error('python.setup_py_install'): - python['install_with_setup'] = validate_bool( - raw_python['setup_py_install'], - ) - - if 'version' in raw_python: - with self.catch_validation_error('python.version'): - version = str(raw_python['version']) - python['version'] = validate_choice( - version, - self.get_valid_python_versions(), - ) - - return python - - def validate_conda(self): - """Validates the ``conda`` key.""" - conda = {} - - if 'conda' in self._raw_config: - raw_conda = self._raw_config['conda'] - with self.catch_validation_error('conda'): - validate_dict(raw_conda) - with self.catch_validation_error('conda.file'): - if 'file' not in raw_conda: - raise ValidationError('file', VALUE_NOT_FOUND) - conda_environment = validate_path( - raw_conda['file'], - self.base_path, - ) - conda['environment'] = conda_environment - return conda - return None - - def validate_requirements_file(self): - """Validates that the requirements file exists.""" - if 'requirements_file' not in self._raw_config: - requirements_file = self.defaults.get('requirements_file') - else: - requirements_file = self._raw_config['requirements_file'] - if not requirements_file: - return None - with self.catch_validation_error('requirements_file'): - requirements_file = validate_path( - requirements_file, - self.base_path, - ) - return requirements_file - - def validate_formats(self): - """Validates that formats contains only valid formats.""" - formats = self._raw_config.get('formats') - if formats is None: - return self.defaults.get('formats', []) - if formats == ['none']: - return [] - - with self.catch_validation_error('format'): - validate_list(formats) - for format_ in formats: - validate_choice(format_, self.get_valid_formats()) - - return formats - - @property - def formats(self): - """The documentation formats to be built.""" - return self._config['formats'] - - @property - def python(self): - """Python related 
configuration.""" - python = self._config['python'] - requirements = self._config['requirements_file'] - python_install = [] - - # Always append a `PythonInstallRequirements` option. - # If requirements is None, rtd will try to find a requirements file. - python_install.append( - PythonInstallRequirements( - requirements=requirements, - ), - ) - if python['install_with_pip']: - python_install.append( - PythonInstall( - path=self.base_path, - method=PIP, - extra_requirements=python['extra_requirements'], - ), - ) - elif python['install_with_setup']: - python_install.append( - PythonInstall( - path=self.base_path, - method=SETUPTOOLS, - extra_requirements=[], - ), - ) - - return Python( - version=python['version'], - install=python_install, - ) - - @property - def conda(self): - if self._config['conda'] is not None: - return Conda(**self._config['conda']) - return None - - @property - @lru_cache(maxsize=1) - def build(self): - """The docker image used by the builders.""" - return Build(**self._config['build']) - - @property - def doctype(self): - return self.defaults['doctype'] - - @property - def sphinx(self): - config_file = self.defaults['sphinx_configuration'] - if config_file is not None: - config_file = os.path.join(self.base_path, config_file) - return Sphinx( - builder=self.doctype, - configuration=config_file, - fail_on_warning=False, - ) - - @property - def mkdocs(self): - return Mkdocs( - configuration=None, - fail_on_warning=False, - ) - - @property - def submodules(self): - return Submodules( - include=ALL, - exclude=[], - recursive=True, - ) - - @property - def search(self): - return Search(ranking={}, ignore=[]) - - class BuildConfigV2(BuildConfigBase): """Version 2 of the configuration file.""" @@ -745,13 +347,7 @@ def settings(self): return settings.RTD_DOCKER_BUILD_SETTINGS def validate(self): - """ - Validates and process ``raw_config`` and ``env_config``. - - Sphinx is the default doc type to be built. 
We don't merge some values - from the database (like formats or python.version) to allow us set - default values. - """ + """Validates and processes ``raw_config``.""" self._config['formats'] = self.validate_formats() self._config['conda'] = self.validate_conda() # This should be called before validate_python @@ -796,6 +392,7 @@ def validate_conda(self): conda['environment'] = validate_path(environment, self.base_path) return conda + # TODO: rename this method to just `validate_build_config` def validate_build_config_with_os(self): """ Validates the build object (new format). @@ -877,31 +474,6 @@ def validate_build_config_with_os(self): build['apt_packages'] = self.validate_apt_packages() return build - def validate_old_build_config(self): - """ - Validates the build object (old format). - - It prioritizes the value from the default image if exists. - """ - build = {} - with self.catch_validation_error('build.image'): - image = self.pop_config('build.image', self.default_build_image) - build['image'] = '{}:{}'.format( - constants_docker.DOCKER_DEFAULT_IMAGE, - validate_choice( - image, - self.valid_build_images, - ), - ) - - # Allow to override specific project - config_image = self.defaults.get('build_image') - if config_image: - build['image'] = config_image - - build['apt_packages'] = self.validate_apt_packages() - return build - def validate_apt_packages(self): apt_packages = [] with self.catch_validation_error('build.apt_packages'): @@ -925,9 +497,7 @@ def validate_build(self): raw_build = self._raw_config.get('build', {}) with self.catch_validation_error('build'): validate_dict(raw_build) - if "os" in raw_build or "commands" in raw_build or "tools" in raw_build: - return self.validate_build_config_with_os() - return self.validate_old_build_config() + return self.validate_build_config_with_os() def validate_apt_package(self, index): """ @@ -977,35 +547,15 @@ def validate_python(self): validate_build should be called before this, since it initialize 
the build.image attribute. - Fall back to the defaults of: - - ``requirements`` - - ``install`` (only for setup.py method) - .. note:: - ``version`` can be a string or number type. - ``extra_requirements`` needs to be used with ``install: 'pip'``. - - If the new build config is used (``build.os``), - ``python.version`` shouldn't exist. """ raw_python = self._raw_config.get('python', {}) with self.catch_validation_error('python'): validate_dict(raw_python) python = {} - if not self.using_build_tools: - with self.catch_validation_error('python.version'): - version = self.pop_config('python.version', '3') - if version == 3.1: - # Special case for ``python.version: 3.10``, - # yaml will transform this to the numeric value of `3.1`. - # Save some frustration to users. - version = '3.10' - version = str(version) - python['version'] = validate_choice( - version, - self.get_valid_python_versions(), - ) - with self.catch_validation_error('python.install'): raw_install = self._raw_config.get('python', {}).get('install', []) validate_list(raw_install) @@ -1079,15 +629,6 @@ def validate_python_install(self, index): ) return python_install - def get_valid_python_versions(self): - """ - Get the valid python versions for the current docker image. - - This should be called after ``validate_build()``. - """ - build_image = self.build.image - return self.get_valid_python_versions_for_image(build_image) - def validate_doc_types(self): """ Validates that the user only have one type of documentation. 
@@ -1159,13 +700,8 @@ def validate_sphinx(self): sphinx['builder'] = self.valid_sphinx_builders[builder] with self.catch_validation_error('sphinx.configuration'): - configuration = self.defaults.get('sphinx_configuration') - # The default value can be empty - if not configuration: - configuration = None configuration = self.pop_config( 'sphinx.configuration', - configuration, ) if configuration is not None: configuration = validate_path(configuration, self.base_path) @@ -1326,22 +862,20 @@ def conda(self): @lru_cache(maxsize=1) def build(self): build = self._config['build'] - if 'os' in build: - tools = { - tool: BuildTool( - version=version, - full_version=self.settings['tools'][tool][version], - ) - for tool, version in build['tools'].items() - } - return BuildWithOs( - os=build['os'], - tools=tools, - jobs=BuildJobs(**build["jobs"]), - commands=build["commands"], - apt_packages=build["apt_packages"], + tools = { + tool: BuildTool( + version=version, + full_version=self.settings["tools"][tool][version], ) - return Build(**build) + for tool, version in build["tools"].items() + } + return BuildWithOs( + os=build["os"], + tools=tools, + jobs=BuildJobs(**build["jobs"]), + commands=build["commands"], + apt_packages=build["apt_packages"], + ) @property def python(self): @@ -1352,8 +886,8 @@ def python(self): python_install.append(PythonInstallRequirements(**install),) elif 'path' in install: python_install.append(PythonInstall(**install),) + return Python( - version=python.get('version'), install=python_install, ) @@ -1387,13 +921,11 @@ def search(self): return Search(**self._config['search']) -def load(path, env_config, readthedocs_yaml_path=None): +def load(path, readthedocs_yaml_path=None): """ Load a project configuration and the top-most build config for a given path. - That is usually the root of the project, but will look deeper. According to - the version of the configuration a build object would be load and validated, - ``BuildConfigV1`` is the default. 
+    That is usually the root of the project, but will look deeper.
     """
     # Custom non-default config file location
     if readthedocs_yaml_path:
@@ -1406,9 +938,7 @@ def load(path, env_config, readthedocs_yaml_path=None):
     else:
         filename = find_one(path, CONFIG_FILENAME_REGEX)

     if not filename:
-        # This exception is current caught higher up and will result in an attempt
-        # to load the v1 config schema.
-        raise DefaultConfigFileNotFound(path)
+        raise DefaultConfigFileNotFound()

     # Allow symlinks, but only the ones that resolve inside the base directory.
     with safe_open(
@@ -1424,9 +954,8 @@ def load(path, env_config, readthedocs_yaml_path=None):
             ),
             code=CONFIG_SYNTAX_INVALID,
         ) from error
-    version = config.get('version', 1)
+    version = config.get("version", 2)
     build_config = get_configuration_class(version)(
-        env_config,
         config,
         source_file=filename,
     )
@@ -1442,7 +971,6 @@ def get_configuration_class(version):

     :type version: str or int
     """
     configurations_class = {
-        1: BuildConfigV1,
         2: BuildConfigV2,
     }
     try:
diff --git a/readthedocs/config/models.py b/readthedocs/config/models.py
index cbb27b854a8..237c00f388e 100644
--- a/readthedocs/config/models.py
+++ b/readthedocs/config/models.py
@@ -26,15 +26,7 @@ def as_dict(self):
         }


-class Build(Base):
-
-    __slots__ = ('image', 'apt_packages')
-
-    def __init__(self, **kwargs):
-        kwargs.setdefault('apt_packages', [])
-        super().__init__(**kwargs)
-
-
+# TODO: rename this class to `Build`
 class BuildWithOs(Base):

     __slots__ = ("os", "tools", "jobs", "apt_packages", "commands")

@@ -80,7 +72,7 @@ def __init__(self, **kwargs):

 class Python(Base):

-    __slots__ = ("version", "install")
+    __slots__ = ("install",)


 class PythonInstallRequirements(Base):
diff --git a/readthedocs/config/tests/test_config.py b/readthedocs/config/tests/test_config.py
index a836e78ffad..8e73b6d8a9a 100644
--- a/readthedocs/config/tests/test_config.py
+++ b/readthedocs/config/tests/test_config.py
@@ -2,7 +2,6 @@
 import re
 import textwrap
 from collections import OrderedDict
-from unittest.mock import DEFAULT, patch

 import pytest
 from django.conf import settings
@@ -13,10 +12,8 @@
     ALL,
     PIP,
     SETUPTOOLS,
-    BuildConfigV1,
     BuildConfigV2,
     ConfigError,
-    ConfigOptionNotSupportedError,
     DefaultConfigFileNotFound,
     InvalidConfig,
     load,
@@ -24,48 +21,44 @@ from readthedocs.config.config import (
     CONFIG_FILE_REQUIRED,
     CONFIG_FILENAME_REGEX,
-    CONFIG_NOT_SUPPORTED,
     CONFIG_REQUIRED,
     CONFIG_SYNTAX_INVALID,
     INVALID_KEY,
     INVALID_NAME,
-    PYTHON_INVALID,
     VERSION_INVALID,
 )
 from readthedocs.config.models import (
-    Build,
     BuildJobs,
     BuildWithOs,
-    Conda,
     PythonInstall,
     PythonInstallRequirements,
 )
-from readthedocs.config.validation import (
-    INVALID_BOOL,
-    INVALID_CHOICE,
-    INVALID_LIST,
-    VALUE_NOT_FOUND,
-    ValidationError,
-)
+from readthedocs.config.validation import VALUE_NOT_FOUND, ValidationError

 from .utils import apply_fs

-yaml_config_dir = {
-    'readthedocs.yml': textwrap.dedent(
-        '''
-        formats:
-          - pdf
-        '''
-    ),
-}
+def get_build_config(config, source_file="readthedocs.yml", validate=False):
+    # I'm adding these defaults here to avoid modifying all the config file from all the tests
+    final_config = {
+        "version": "2",
+        "build": {
+            "os": "ubuntu-22.04",
+            "tools": {
+                "python": "3",
+            },
+        },
+    }
+    final_config.update(config)

-def get_build_config(config, env_config=None, source_file='readthedocs.yml'):
-    return BuildConfigV1(
-        env_config or {},
-        config,
+    build_config = BuildConfigV2(
+        final_config,
         source_file=source_file,
     )
+    if validate:
+        build_config.validate()
+
+    return build_config


 @pytest.mark.parametrize(
@@ -98,34 +91,19 @@ def test_load_empty_config_file(tmpdir):
         load(base, {})


-def test_minimal_config(tmpdir):
-    apply_fs(tmpdir, yaml_config_dir)
-    base = str(tmpdir)
-    with override_settings(DOCROOT=tmpdir):
-        build = load(base, {})
-    assert isinstance(build, BuildConfigV1)
-
-
-def test_load_version1(tmpdir):
-    apply_fs(
-        tmpdir, {
-            'readthedocs.yml': textwrap.dedent('''
-                version: 1
-            '''),
-        },
-    )
-    base = str(tmpdir)
-    with override_settings(DOCROOT=tmpdir):
-        build = load(base, {})
-    assert isinstance(build, BuildConfigV1)
-
-
 def test_load_version2(tmpdir):
     apply_fs(
-        tmpdir, {
-            'readthedocs.yml': textwrap.dedent('''
+        tmpdir,
+        {
+            "readthedocs.yml": textwrap.dedent(
+                """
                 version: 2
-            '''),
+                build:
+                  os: "ubuntu-22.04"
+                  tools:
+                    python: "3"
+                """
+            ),
         },
     )
     base = str(tmpdir)
@@ -171,31 +149,6 @@ def test_load_raise_exception_invalid_syntax(tmpdir):
     assert excinfo.value.code == CONFIG_SYNTAX_INVALID


-def test_yaml_extension(tmpdir):
-    """Make sure loading the 'readthedocs' file with a 'yaml' extension."""
-    apply_fs(
-        tmpdir, {
-            'readthedocs.yaml': textwrap.dedent(
-                '''
-                python:
-                  version: 3
-                '''
-            ),
-        },
-    )
-    base = str(tmpdir)
-    with override_settings(DOCROOT=tmpdir):
-        config = load(base, {})
-    assert isinstance(config, BuildConfigV1)
-
-
-def test_build_config_has_source_file(tmpdir):
-    base = str(apply_fs(tmpdir, yaml_config_dir))
-    with override_settings(DOCROOT=tmpdir):
-        build = load(base, {})
-    assert build.source_file == os.path.join(base, 'readthedocs.yml')
-
-
 def test_load_non_default_filename(tmpdir):
     """
     Load a config file name with a non-default name.
@@ -215,6 +168,10 @@ def test_load_non_default_filename(tmpdir):
             non_default_filename: textwrap.dedent(
                 """
                 version: 2
+                build:
+                  os: "ubuntu-22.04"
+                  tools:
+                    python: "3"
                 """
             ),
             ".readthedocs.yaml": "illegal syntax but should not load",
@@ -222,7 +179,7 @@ def test_load_non_default_filename(tmpdir):
     )
     base = str(tmpdir)
     with override_settings(DOCROOT=tmpdir):
-        build = load(base, {}, readthedocs_yaml_path="myconfig.yaml")
+        build = load(base, readthedocs_yaml_path="myconfig.yaml")
     assert isinstance(build, BuildConfigV2)
     assert build.source_file == os.path.join(base, non_default_filename)

@@ -244,6 +201,10 @@ def test_load_non_yaml_extension(tmpdir):
             non_default_filename: textwrap.dedent(
                 """
                 version: 2
+                build:
+                  os: "ubuntu-22.04"
+                  tools:
+                    python: "3"
                 """
             ),
         },
@@ -252,469 +213,11 @@ def test_load_non_yaml_extension(tmpdir):
     base = str(tmpdir)
     with override_settings(DOCROOT=tmpdir):
-        build = load(base, {}, readthedocs_yaml_path="subdir/.readthedocs.skrammel")
+        build = load(base, readthedocs_yaml_path="subdir/.readthedocs.skrammel")
     assert isinstance(build, BuildConfigV2)
     assert build.source_file == os.path.join(base, "subdir/.readthedocs.skrammel")


-def test_build_config_has_list_with_single_empty_value(tmpdir):
-    base = str(apply_fs(
-        tmpdir, {
-            'readthedocs.yml': textwrap.dedent(
-                '''
-                formats: []
-                '''
-            ),
-        },
-    ))
-    with override_settings(DOCROOT=tmpdir):
-        build = load(base, {})
-    assert isinstance(build, BuildConfigV1)
-    assert build.formats == []
-
-
-def test_version():
-    build = get_build_config({})
-    assert build.version == '1'
-
-
-def test_doc_type():
-    build = get_build_config(
-        {},
-        {
-            'defaults': {
-                'doctype': 'sphinx',
-            },
-        },
-    )
-    build.validate()
-    assert build.doctype == 'sphinx'
-
-
-def test_empty_python_section_is_valid():
-    build = get_build_config({'python': {}})
-    build.validate()
-    assert build.python
-
-
-def test_python_section_must_be_dict():
-    build = get_build_config({'python': 123})
-    with raises(InvalidConfig) as excinfo:
-        build.validate()
-    assert excinfo.value.key == 'python'
-    assert excinfo.value.code == PYTHON_INVALID
-
-
-class TestValidatePythonExtraRequirements:
-
-    def test_it_defaults_to_install_requirements_as_none(self):
-        build = get_build_config({'python': {}})
-        build.validate()
-        install = build.python.install
-        assert len(install) == 1
-        assert isinstance(install[0], PythonInstallRequirements)
-        assert install[0].requirements is None
-
-    def test_it_validates_is_a_list(self):
-        build = get_build_config(
-            {'python': {'extra_requirements': 'invalid'}},
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'python.extra_requirements'
-        assert excinfo.value.code == PYTHON_INVALID
-
-    @patch('readthedocs.config.config.validate_string')
-    def test_it_uses_validate_string(self, validate_string):
-        validate_string.return_value = True
-        build = get_build_config(
-            {
-                'python': {
-                    'pip_install': True,
-                    'extra_requirements': ['tests'],
-                },
-            },
-        )
-        build.validate()
-        validate_string.assert_any_call('tests')
-
-
-class TestValidateSetupPyInstall:
-
-    def test_it_defaults_to_false(self):
-        build = get_build_config({'python': {}})
-        build.validate()
-        install = build.python.install
-        assert len(install) == 1
-        assert isinstance(install[0], PythonInstallRequirements)
-        assert install[0].requirements is None
-
-    def test_it_validates_value(self):
-        build = get_build_config(
-            {'python': {'setup_py_install': 'this-is-string'}},
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'python.setup_py_install'
-        assert excinfo.value.code == INVALID_BOOL
-
-    @patch('readthedocs.config.config.validate_bool')
-    def test_it_uses_validate_bool(self, validate_bool):
-        validate_bool.return_value = True
-        build = get_build_config(
-            {'python': {'setup_py_install': 'to-validate'}},
-        )
-        build.validate()
-        validate_bool.assert_any_call('to-validate')
-
-
-class TestValidatePythonVersion:
-
-    def test_it_defaults_to_a_valid_version(self):
-        build = get_build_config({'python': {}})
-        build.validate()
-        assert build.python.version == '2'
-        assert build.python_interpreter == 'python2.7'
-        assert build.python_full_version == '2.7'
-
-    def test_it_supports_other_versions(self):
-        build = get_build_config(
-            {'python': {'version': 3.7}},
-        )
-        build.validate()
-        assert build.python.version == '3.7'
-        assert build.python_interpreter == 'python3.7'
-        assert build.python_full_version == '3.7'
-
-    def test_it_validates_versions_out_of_range(self):
-        build = get_build_config(
-            {'python': {'version': 1.0}},
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'python.version'
-        assert excinfo.value.code == INVALID_CHOICE
-
-    def test_it_validates_wrong_type(self):
-        build = get_build_config(
-            {'python': {'version': 'this-is-string'}},
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'python.version'
-        assert excinfo.value.code == INVALID_CHOICE
-
-    def test_it_validates_env_supported_versions(self):
-        build = get_build_config(
-            {'python': {'version': '3.6'}},
-            env_config={
-                'python': {'supported_versions': ['3.5']},
-                'build': {'image': 'custom'},
-            },
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'python.version'
-        assert excinfo.value.code == INVALID_CHOICE
-
-        build = get_build_config(
-            {'python': {'version': '3.6'}},
-            env_config={
-                'python': {'supported_versions': ['3.5', '3.6']},
-                'build': {'image': 'custom'},
-            },
-        )
-        build.validate()
-        assert build.python.version == '3.6'
-        assert build.python_interpreter == 'python3.6'
-        assert build.python_full_version == '3.6'
-
-    @pytest.mark.parametrize('value', ['2', '3'])
-    def test_it_respects_default_value(self, value):
-        defaults = {
-            'python_version': value,
-        }
-        build = get_build_config(
-            {},
-            {'defaults': defaults},
-        )
-        build.validate()
-        assert build.python.version == value
-
-
-class TestValidateFormats:
-
-    def test_it_defaults_to_empty(self):
-        build = get_build_config({})
-        build.validate()
-        assert build.formats == []
-
-    def test_it_gets_set_correctly(self):
-        build = get_build_config({'formats': ['pdf']})
-        build.validate()
-        assert build.formats == ['pdf']
-
-    def test_formats_can_be_null(self):
-        build = get_build_config({'formats': None})
-        build.validate()
-        assert build.formats == []
-
-    def test_formats_with_previous_none(self):
-        build = get_build_config({'formats': ['none']})
-        build.validate()
-        assert build.formats == []
-
-    def test_formats_can_be_empty(self):
-        build = get_build_config({'formats': []})
-        build.validate()
-        assert build.formats == []
-
-    def test_all_valid_formats(self):
-        build = get_build_config(
-            {'formats': ['pdf', 'htmlzip', 'epub']},
-        )
-        build.validate()
-        assert build.formats == ['pdf', 'htmlzip', 'epub']
-
-    def test_cant_have_none_as_format(self):
-        build = get_build_config(
-            {'formats': ['htmlzip', None]},
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'format'
-        assert excinfo.value.code == INVALID_CHOICE
-
-    def test_formats_have_only_allowed_values(self):
-        build = get_build_config(
-            {'formats': ['htmlzip', 'csv']},
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'format'
-        assert excinfo.value.code == INVALID_CHOICE
-
-    def test_only_list_type(self):
-        build = get_build_config({'formats': 'no-list'})
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'format'
-        assert excinfo.value.code == INVALID_LIST
-
-
-def test_valid_build_config():
-    build = BuildConfigV1(
-        {},
-        {},
-        source_file='readthedocs.yml',
-    )
-    build.validate()
-    assert build.python
-    assert len(build.python.install) == 1
-    assert isinstance(build.python.install[0], PythonInstallRequirements)
-    assert build.python.install[0].requirements is None
-
-
-class TestValidateBuild:
-
-    def test_it_fails_if_build_is_invalid_option(self, tmpdir):
-        apply_fs(tmpdir, yaml_config_dir)
-        build = BuildConfigV1(
-            {},
-            {'build': {'image': 3.2}},
-            source_file=str(tmpdir.join('readthedocs.yml')),
-        )
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'build'
-        assert excinfo.value.code == INVALID_CHOICE
-
-    def test_it_fails_on_python_validation(self, tmpdir):
-        apply_fs(tmpdir, yaml_config_dir)
-        build = BuildConfigV1(
-            {},
-            {
-                'build': {'image': 2.0},
-                'python': {'version': '3.8'},
-            },
-            source_file=str(tmpdir.join('readthedocs.yml')),
-        )
-        build.validate_build()
-        with raises(InvalidConfig) as excinfo:
-            build.validate_python()
-        assert excinfo.value.key == 'python.version'
-        assert excinfo.value.code == INVALID_CHOICE
-
-    def test_it_works_on_python_validation(self, tmpdir):
-        apply_fs(tmpdir, yaml_config_dir)
-        build = BuildConfigV1(
-            {},
-            {
-                'build': {'image': 'latest'},
-                'python': {'version': '3.6'},
-            },
-            source_file=str(tmpdir.join('readthedocs.yml')),
-        )
-        build.validate_build()
-        build.validate_python()
-
-    def test_it_works(self, tmpdir):
-        apply_fs(tmpdir, yaml_config_dir)
-        build = BuildConfigV1(
-            {},
-            {'build': {'image': 'latest'}},
-            source_file=str(tmpdir.join('readthedocs.yml')),
-        )
-        build.validate()
-        assert build.build.image == 'readthedocs/build:latest'
-
-    def test_default(self, tmpdir):
-        apply_fs(tmpdir, yaml_config_dir)
-        build = BuildConfigV1(
-            {},
-            {},
-            source_file=str(tmpdir.join('readthedocs.yml')),
-        )
-        build.validate()
-        assert build.build.image == 'readthedocs/build:latest'
-
-    @pytest.mark.parametrize(
-        'image', ['latest', 'readthedocs/build:3.0', 'rtd/build:latest'],
-    )
-    def test_it_priorities_image_from_env_config(self, tmpdir, image):
-        apply_fs(tmpdir, yaml_config_dir)
-        defaults = {
-            'build_image': image,
-        }
-        build = BuildConfigV1(
-            {'defaults': defaults},
-            {'build': {'image': 'latest'}},
-            source_file=str(tmpdir.join('readthedocs.yml')),
-        )
-        build.validate()
-        assert build.build.image == image
-
-
-def test_use_conda_default_none():
-    build = get_build_config({})
-    build.validate()
-    assert build.conda is None
-
-
-def test_validates_conda_file(tmpdir):
-    apply_fs(tmpdir, {'environment.yml': ''})
-    build = get_build_config(
-        {'conda': {'file': 'environment.yml'}},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    build.validate()
-    assert isinstance(build.conda, Conda)
-    assert build.conda.environment == 'environment.yml'
-
-
-def test_file_is_required_when_using_conda(tmpdir):
-    apply_fs(tmpdir, {'environment.yml': ''})
-    build = get_build_config(
-        {'conda': {'foo': 'environment.yml'}},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    with raises(InvalidConfig) as excinfo:
-        build.validate()
-    assert excinfo.value.key == 'conda.file'
-    assert excinfo.value.code == VALUE_NOT_FOUND
-
-
-def test_requirements_file_empty():
-    build = get_build_config({})
-    build.validate()
-    install = build.python.install
-    assert len(install) == 1
-    assert install[0].requirements is None
-
-
-def test_requirements_file_repects_default_value(tmpdir):
-    apply_fs(tmpdir, {'myrequirements.txt': ''})
-    defaults = {
-        'requirements_file': 'myrequirements.txt',
-    }
-    build = get_build_config(
-        {},
-        {'defaults': defaults},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    build.validate()
-    install = build.python.install
-    assert len(install) == 1
-    assert install[0].requirements == 'myrequirements.txt'
-
-
-def test_requirements_file_respects_configuration(tmpdir):
-    apply_fs(tmpdir, {'requirements.txt': ''})
-    build = get_build_config(
-        {'requirements_file': 'requirements.txt'},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    build.validate()
-    install = build.python.install
-    assert len(install) == 1
-    assert install[0].requirements == 'requirements.txt'
-
-
-def test_requirements_file_is_null(tmpdir):
-    build = get_build_config(
-        {'requirements_file': None},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    build.validate()
-    install = build.python.install
-    assert len(install) == 1
-    assert install[0].requirements is None
-
-
-def test_requirements_file_is_blank(tmpdir):
-    build = get_build_config(
-        {'requirements_file': ''},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    build.validate()
-    install = build.python.install
-    assert len(install) == 1
-    assert install[0].requirements is None
-
-
-def test_build_validate_calls_all_subvalidators(tmpdir):
-    apply_fs(tmpdir, {})
-    build = BuildConfigV1(
-        {},
-        {},
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    with patch.multiple(
-        BuildConfigV1,
-        validate_python=DEFAULT,
-    ):
-        build.validate()
-    BuildConfigV1.validate_python.assert_called_with()
-
-
-def test_load_calls_validate(tmpdir):
-    apply_fs(tmpdir, yaml_config_dir)
-    base = str(tmpdir)
-    with patch.object(BuildConfigV1, 'validate') as build_validate:
-        with override_settings(DOCROOT=tmpdir):
-            load(base, {})
-        assert build_validate.call_count == 1
-
-
-def test_raise_config_not_supported():
-    build = get_build_config({})
-    build.validate()
-    with raises(ConfigOptionNotSupportedError) as excinfo:
-        build.redirects
-    assert excinfo.value.configuration == 'redirects'
-    assert excinfo.value.code == CONFIG_NOT_SUPPORTED
-
-
 @pytest.mark.parametrize(
     'correct_config_filename',
     [prefix + 'readthedocs.' + extension for prefix in {'', '.'}
@@ -724,319 +227,212 @@ def test_config_filenames_regex(correct_config_filename):
     assert re.match(CONFIG_FILENAME_REGEX, correct_config_filename)


-def test_as_dict(tmpdir):
-    apply_fs(tmpdir, {'requirements.txt': ''})
-    build = get_build_config(
-        {
-            'version': 1,
-            'formats': ['pdf'],
-            'python': {
-                'version': 3.7,
-            },
-            'requirements_file': 'requirements.txt',
-        },
-        {
-            'defaults': {
-                'doctype': 'sphinx',
-                'sphinx_configuration': None,
-            },
-        },
-        source_file=str(tmpdir.join('readthedocs.yml')),
-    )
-    build.validate()
-    expected_dict = {
-        'version': '1',
-        'formats': ['pdf'],
-        'python': {
-            'version': '3.7',
-            'install': [{
-                'requirements': 'requirements.txt',
-            }],
-        },
-        'build': {
-            'image': 'readthedocs/build:latest',
-            'apt_packages': [],
-        },
-        'conda': None,
-        'sphinx': {
-            'builder': 'sphinx',
-            'configuration': None,
-            'fail_on_warning': False,
-        },
-        'mkdocs': {
-            'configuration': None,
-            'fail_on_warning': False,
-        },
-        'doctype': 'sphinx',
-        'submodules': {
-            'include': ALL,
-            'exclude': [],
-            'recursive': True,
-        },
-        'search': {
-            'ranking': {},
-            'ignore': [],
-        },
-    }
-    assert build.as_dict() == expected_dict
-
-
 class TestBuildConfigV2:
-
-    def get_build_config(
-        self, config, env_config=None, source_file='readthedocs.yml',
-    ):
-        return BuildConfigV2(
-            env_config or {},
-            config,
-            source_file=source_file,
-        )
-
     def test_version(self):
-        build = self.get_build_config({})
-        assert build.version == '2'
+        build = get_build_config({})
+        assert build.version == "2"

     def test_correct_error_when_source_is_dir(self, tmpdir):
-        build = self.get_build_config({}, source_file=str(tmpdir))
+        build = get_build_config({}, source_file=str(tmpdir))
         with raises(InvalidConfig) as excinfo:
-            build.error(key='key', message='Message', code='code')
+            build.error(key="key", message="Message", code="code")

         # We don't have any extra information about
         # the source_file.
         assert str(excinfo.value) == 'Invalid configuration option "key": Message'

     def test_formats_check_valid(self):
-        build = self.get_build_config({'formats': ['htmlzip', 'pdf', 'epub']})
+        build = get_build_config({"formats": ["htmlzip", "pdf", "epub"]})
         build.validate()
-        assert build.formats == ['htmlzip', 'pdf', 'epub']
+        assert build.formats == ["htmlzip", "pdf", "epub"]

-    @pytest.mark.parametrize('value', [3, 'invalid', {'other': 'value'}])
+    @pytest.mark.parametrize("value", [3, "invalid", {"other": "value"}])
     def test_formats_check_invalid_value(self, value):
-        build = self.get_build_config({'formats': value})
+        build = get_build_config({"formats": value})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'formats'
+        assert excinfo.value.key == "formats"

     def test_formats_check_invalid_type(self):
-        build = self.get_build_config(
-            {'formats': ['htmlzip', 'invalid', 'epub']},
+        build = get_build_config(
+            {"formats": ["htmlzip", "invalid", "epub"]},
         )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'formats'
+        assert excinfo.value.key == "formats"

     def test_formats_default_value(self):
-        build = self.get_build_config({})
+        build = get_build_config({})
         build.validate()
         assert build.formats == []

+    # TODO: remove/adapt all these tests that use "defaults".
+    # I'm removing them from the code since we don't need them anymore.
     def test_formats_overrides_default_values(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {},
-            {'defaults': {'formats': ['htmlzip']}},
         )
         build.validate()
         assert build.formats == []

     def test_formats_priority_over_defaults(self):
-        build = self.get_build_config(
-            {'formats': []},
-            {'defaults': {'formats': ['htmlzip']}},
+        build = get_build_config(
+            {"formats": []},
         )
         build.validate()
         assert build.formats == []

-        build = self.get_build_config(
-            {'formats': ['pdf']},
-            {'defaults': {'formats': ['htmlzip']}},
+        build = get_build_config(
+            {"formats": ["pdf"]},
         )
         build.validate()
-        assert build.formats == ['pdf']
+        assert build.formats == ["pdf"]

     def test_formats_allow_empty(self):
-        build = self.get_build_config({'formats': []})
+        build = get_build_config({"formats": []})
         build.validate()
         assert build.formats == []

     def test_formats_allow_all_keyword(self):
-        build = self.get_build_config({'formats': 'all'})
+        build = get_build_config({"formats": "all"})
         build.validate()
-        assert build.formats == ['htmlzip', 'pdf', 'epub']
+        assert build.formats == ["htmlzip", "pdf", "epub"]

     def test_conda_check_valid(self, tmpdir):
-        apply_fs(tmpdir, {'environment.yml': ''})
-        build = self.get_build_config(
-            {'conda': {'environment': 'environment.yml'}},
-            source_file=str(tmpdir.join('readthedocs.yml')),
+        apply_fs(tmpdir, {"environment.yml": ""})
+        build = get_build_config(
+            {"conda": {"environment": "environment.yml"}},
+            source_file=str(tmpdir.join("readthedocs.yml")),
         )
         build.validate()
-        assert build.conda.environment == 'environment.yml'
+        assert build.conda.environment == "environment.yml"

-    @pytest.mark.parametrize('value', [3, [], 'invalid'])
+    @pytest.mark.parametrize("value", [3, [], "invalid"])
     def test_conda_check_invalid_value(self, value):
-        build = self.get_build_config({'conda': value})
+        build = get_build_config({"conda": value})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'conda'
+        assert excinfo.value.key == "conda"

-    @pytest.mark.parametrize('value', [3, [], {}])
+    @pytest.mark.parametrize("value", [3, [], {}])
     def test_conda_check_invalid_file_value(self, value):
-        build = self.get_build_config({'conda': {'file': value}})
+        build = get_build_config({"conda": {"file": value}})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'conda.environment'
+        assert excinfo.value.key == "conda.environment"

     def test_conda_check_file_required(self):
-        build = self.get_build_config({'conda': {'no-file': 'other'}})
+        build = get_build_config({"conda": {"no-file": "other"}})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'conda.environment'
-
-    @pytest.mark.parametrize('value', ['stable', 'latest', 'testing'])
-    def test_build_image_check_valid(self, value):
-        build = self.get_build_config({'build': {'image': value}})
-        build.validate()
-        assert build.build.image == 'readthedocs/build:{}'.format(value)
+        assert excinfo.value.key == "conda.environment"

-    @pytest.mark.parametrize('value', ['readthedocs/build:latest', 'one'])
-    def test_build_image_check_invalid(self, value):
-        build = self.get_build_config({'build': {'image': value}})
-        with raises(InvalidConfig) as excinfo:
-            build.validate()
-        assert excinfo.value.key == 'build.image'
-
-    @pytest.mark.parametrize(
-        'image', ['latest', 'readthedocs/build:3.0', 'rtd/build:latest'],
-    )
-    def test_build_image_priorities_default(self, image):
-        build = self.get_build_config(
-            {'build': {'image': 'latest'}},
-            {'defaults': {'build_image': image}},
-        )
-        build.validate()
-        assert build.build.image == image
-
-    @pytest.mark.parametrize('image', ['', None])
-    def test_build_image_over_empty_default(self, image):
-        build = self.get_build_config(
-            {'build': {'image': 'latest'}},
-            {'defaults': {'build_image': image}},
-        )
-        build.validate()
-        assert build.build.image == 'readthedocs/build:latest'
-
-    def test_build_image_default_value(self):
-        build = self.get_build_config({})
-        build.validate()
-        assert not build.using_build_tools
-        assert isinstance(build.build, Build)
-        assert build.build.image == 'readthedocs/build:latest'
-
-    @pytest.mark.parametrize('value', [3, [], 'invalid'])
+    @pytest.mark.parametrize("value", [3, [], "invalid"])
     def test_build_check_invalid_type(self, value):
-        build = self.get_build_config({'build': value})
+        build = get_build_config({"build": value})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build'
+        assert excinfo.value.key == "build"

-    @pytest.mark.parametrize('value', [3, [], {}])
+    @pytest.mark.parametrize("value", [3, [], {}])
     def test_build_image_check_invalid_type(self, value):
-        build = self.get_build_config({'build': {'image': value}})
+        build = get_build_config({"build": {"image": value}})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build.image'
+        assert excinfo.value.key == "build.os"

-    @pytest.mark.parametrize('value', ['', None, 'latest'])
+    @pytest.mark.parametrize("value", ["", None, "latest"])
     def test_new_build_config_invalid_os(self, value):
-        build = self.get_build_config(
+        build = get_build_config(
             {
-                'build': {
-                    'os': value,
-                    'tools': {'python': '3'},
+                "build": {
+                    "os": value,
+                    "tools": {"python": "3"},
                 },
             },
         )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build.os'
+        assert excinfo.value.key == "build.os"

-    @pytest.mark.parametrize('value', ['', None, 'python', ['python', 'nodejs'], {}, {'cobol': '99'}])
+    @pytest.mark.parametrize(
+        "value", ["", None, "python", ["python", "nodejs"], {}, {"cobol": "99"}]
+    )
     def test_new_build_config_invalid_tools(self, value):
-        build = self.get_build_config(
+        build = get_build_config(
             {
-                'build': {
-                    'os': 'ubuntu-20.04',
-                    'tools': value,
+                "build": {
+                    "os": "ubuntu-20.04",
+                    "tools": value,
                 },
             },
         )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build.tools'
+        assert excinfo.value.key == "build.tools"

     def test_new_build_config_invalid_tools_version(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
-                'build': {
-                    'os': 'ubuntu-20.04',
-                    'tools': {'python': '2.6'},
+                "build": {
+                    "os": "ubuntu-20.04",
+                    "tools": {"python": "2.6"},
                 },
             },
         )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build.tools.python'
+        assert excinfo.value.key == "build.tools.python"

     def test_new_build_config(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
-                'build': {
-                    'os': 'ubuntu-20.04',
-                    'tools': {'python': '3.9'},
+                "build": {
+                    "os": "ubuntu-20.04",
+                    "tools": {"python": "3.9"},
                 },
             },
         )
         build.validate()
-        assert build.using_build_tools
         assert isinstance(build.build, BuildWithOs)
-        assert build.build.os == 'ubuntu-20.04'
-        assert build.build.tools['python'].version == '3.9'
-        full_version = settings.RTD_DOCKER_BUILD_SETTINGS['tools']['python']['3.9']
-        assert build.build.tools['python'].full_version == full_version
-        assert build.python_interpreter == 'python'
+        assert build.build.os == "ubuntu-20.04"
+        assert build.build.tools["python"].version == "3.9"
+        full_version = settings.RTD_DOCKER_BUILD_SETTINGS["tools"]["python"]["3.9"]
+        assert build.build.tools["python"].full_version == full_version
+        assert build.python_interpreter == "python"

     def test_new_build_config_conflict_with_build_image(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
-                'build': {
-                    'image': 'latest',
-                    'os': 'ubuntu-20.04',
-                    'tools': {'python': '3.9'},
+                "build": {
+                    "image": "latest",
+                    "os": "ubuntu-20.04",
+                    "tools": {"python": "3.9"},
                 },
             },
         )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build.image'
+        assert excinfo.value.key == "build.image"

     def test_new_build_config_conflict_with_build_python_version(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
-                'build': {
-                    'os': 'ubuntu-20.04',
-                    'tools': {'python': '3.8'},
+                "build": {
+                    "os": "ubuntu-20.04",
+                    "tools": {"python": "3.8"},
                 },
-                'python': {'version': '3.8'},
+                "python": {"version": "3.8"},
             },
         )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'python.version'
+        assert excinfo.value.key == "python.version"

     def test_commands_build_config_tools_and_commands_valid(self):
         """
         Test that build.tools and build.commands are valid together.
         """
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "os": "ubuntu-20.04",
@@ -1053,7 +449,7 @@ def test_build_jobs_without_build_os_is_invalid(self):
         """
         build.jobs can't be used without build.os
         """
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "tools": {"python": "3.8"},
@@ -1068,7 +464,7 @@ def test_build_jobs_without_build_os_is_invalid(self):
         assert excinfo.value.key == "build.os"

     def test_commands_build_config_invalid_command(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "os": "ubuntu-20.04",
@@ -1082,7 +478,7 @@ def test_commands_build_config_invalid_command(self):
         assert excinfo.value.key == "build.commands"

     def test_commands_build_config_invalid_no_os(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "commands": ["pip install pelican", "pelican content"],
@@ -1095,7 +491,7 @@ def test_commands_build_config_invalid_no_os(self):

     def test_commands_build_config_valid(self):
         """It's valid to build with just build.os and build.commands."""
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "os": "ubuntu-22.04",
@@ -1111,7 +507,7 @@ def test_commands_build_config_valid(self):

     @pytest.mark.parametrize("value", ["", None, "pre_invalid"])
     def test_jobs_build_config_invalid_jobs(self, value):
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "os": "ubuntu-20.04",
@@ -1126,7 +522,7 @@ def test_jobs_build_config_invalid_jobs(self, value):
     @pytest.mark.parametrize("value", ["", None, "echo 123", 42])
     def test_jobs_build_config_invalid_job_commands(self, value):
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "os": "ubuntu-20.04",
@@ -1142,7 +538,7 @@ def test_jobs_build_config_invalid_job_commands(self, value):
         assert excinfo.value.key == "build.jobs.pre_install"

     def test_jobs_build_config(self):
-        build = self.get_build_config(
+        build = get_build_config(
             {
                 "build": {
                     "os": "ubuntu-20.04",
@@ -1194,27 +590,44 @@ def test_jobs_build_config(self):
         'value',
         [
             [],
-            ['cmatrix'],
-            ['Mysql', 'cmatrix', 'postgresql-dev'],
+            ["cmatrix"],
+            ["Mysql", "cmatrix", "postgresql-dev"],
         ],
     )
     def test_build_apt_packages_check_valid(self, value):
-        build = self.get_build_config({'build': {'apt_packages': value}})
+        build = get_build_config(
+            {
+                "build": {
+                    "os": "ubuntu-22.04",
+                    "tools": {"python": "3"},
+                    "apt_packages": value,
+                }
+            }
+        )
         build.validate()
+        assert build.build.apt_packages == value

     @pytest.mark.parametrize(
-        'value',
-        [3, 'string', {}],
+        "value",
+        [3, "string", {}],
     )
     def test_build_apt_packages_invalid_type(self, value):
-        build = self.get_build_config({'build': {'apt_packages': value}})
+        build = get_build_config(
+            {
+                "build": {
+                    "os": "ubuntu-22.04",
+                    "tools": {"python": "3"},
+                    "apt_packages": value,
+                }
+            }
+        )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'build.apt_packages'
+        assert excinfo.value.key == "build.apt_packages"

     @pytest.mark.parametrize(
-        'error_index, value',
+        "error_index, value",
         [
             (0, ['/', 'cmatrix']),
             (1, ['cmatrix', '-q']),
@@ -1234,165 +647,55 @@ def test_build_apt_packages_invalid_type(self, value):
             (1, ['mysql', 'cmatrix$']),
             (0, ['^mysql-*', 'cmatrix$']),
             # We don't allow specifying versions for now.
-            (0, ['postgresql=1.2.3']),
+            (0, ["postgresql=1.2.3"]),
             # We don't allow specifying distributions for now.
-            (0, ['cmatrix/bionic']),
+            (0, ["cmatrix/bionic"]),
         ],
     )
     def test_build_apt_packages_invalid_value(self, error_index, value):
-        build = self.get_build_config({'build': {'apt_packages': value}})
+        build = get_build_config(
+            {
+                "build": {
+                    "os": "ubuntu-22.04",
+                    "tools": {"python": "3"},
+                    "apt_packages": value,
+                }
+            }
+        )
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == f'build.apt_packages.{error_index}'
+        assert excinfo.value.key == f"build.apt_packages.{error_index}"
         assert excinfo.value.code == INVALID_NAME

-    @pytest.mark.parametrize('value', [3, [], 'invalid'])
+    @pytest.mark.parametrize("value", [3, [], "invalid"])
     def test_python_check_invalid_types(self, value):
-        build = self.get_build_config({'python': value})
+        build = get_build_config({"python": value})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-        assert excinfo.value.key == 'python'
-
-    @pytest.mark.parametrize(
-        'image,versions',
-        [
-            ("latest", ["2", "2.7", "3", "3.5", "3.6", "3.7"]),
-            ("stable", ["2", "2.7", "3", "3.5", "3.6", "3.7"]),
-        ],
-    )
-    def test_python_version(self, image, versions):
-        for version in versions:
-            build = self.get_build_config({
-                'build': {
-                    'image': image,
-                },
-                'python': {
-                    'version': version,
-                },
-            })
-            build.validate()
-            assert build.python.version == version
-
-    def test_python_version_accepts_string(self):
-        build = self.get_build_config({
-            'build': {
-                'image': 'latest',
-            },
-            'python': {
-                'version': '3.6',
-            },
-        })
-        build.validate()
-        assert build.python.version == '3.6'
-
-    def test_python_version_accepts_number(self):
-        build = self.get_build_config({
-            'build': {
-                'image': 'latest',
-            },
-            'python': {
-                'version': 3.6,
-            },
-        })
-        build.validate()
-        assert build.python.version == '3.6'
-
-    def test_python_version_310_as_number(self):
-        build = self.get_build_config({
-            'build': {
-                'image': 'testing',
-            },
-            'python': {
-                'version': 3.10,
-            },
-        })
-        build.validate()
-        assert build.python.version == '3.10'
+        assert excinfo.value.key == "python"

-    @pytest.mark.parametrize(
-        'image,versions',
-        [
-            ('latest', [1, 2.8, 4]),
-            ('stable', [1, 2.8, 4]),
-        ],
-    )
-    def test_python_version_invalid(self, image, versions):
-        for version in versions:
-            build = self.get_build_config({
-                'build': {
-                    'image': image,
-                },
-                'python': {
-                    'version': version,
-                },
-            })
-            with raises(InvalidConfig) as excinfo:
-                build.validate()
-            assert excinfo.value.key == 'python.version'
-
-    def test_python_version_default(self):
-        build = self.get_build_config({})
-        build.validate()
-        assert build.python.version == '3'
-
-    @pytest.mark.parametrize(
-        'image, default_version, full_version',
-        [
-            ('2.0', '3', '3.5'),
-            ('4.0', '3', '3.7'),
-            ('5.0', '3', '3.7'),
-            ('latest', '3', '3.7'),
-            ('stable', '3', '3.7'),
-        ],
-    )
-    def test_python_version_default_from_image(self, image, default_version, full_version):
-        build = self.get_build_config({
-            'build': {
-                'image': image,
-            },
-        })
-        build.validate()
-        assert build.python.version == default_version
-        assert build.python_full_version == full_version
-
-    @pytest.mark.parametrize('value', [2, 3])
-    def test_python_version_overrides_default(self, value):
-        build = self.get_build_config(
-            {},
-            {'defaults': {'python_version': value}},
-        )
-        build.validate()
-        assert build.python.version == '3'
-
-    @pytest.mark.parametrize('value', ['2', '3', '3.6'])
-    def test_python_version_priority_over_default(self, value):
-        build = self.get_build_config(
-            {'python': {'version': value}},
-            {'defaults': {'python_version': '3'}},
-        )
-        build.validate()
-        assert build.python.version == value
-
-    @pytest.mark.parametrize('value', [[], {}])
+    @pytest.mark.parametrize("value", [[], {}, "3", "3.10"])
     def test_python_version_check_invalid_types(self, value):
-        build = self.get_build_config({'python': {'version': value}})
+        build = get_build_config({"python": {"version": value}})
         with raises(InvalidConfig) as excinfo:
             build.validate()
-
assert excinfo.value.key == 'python.version' + assert excinfo.value.key == "python.version" def test_python_install_default_value(self): - build = self.get_build_config({}) + build = get_build_config({}) build.validate() install = build.python.install assert len(install) == 0 def test_python_install_check_default(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - }], + "python": { + "install": [ + { + "path": ".", + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), @@ -1401,35 +704,35 @@ def test_python_install_check_default(self, tmpdir): install = build.python.install assert len(install) == 1 assert isinstance(install[0], PythonInstall) - assert install[0].path == '.' + assert install[0].path == "." assert install[0].method == PIP assert install[0].extra_requirements == [] - @pytest.mark.parametrize('value', ['invalid', 'apt']) + @pytest.mark.parametrize("value", ["invalid", "apt"]) def test_python_install_method_check_invalid(self, value, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': value, - }], + "python": { + "install": [ + { + "path": ".", + "method": value, + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0.method' + assert excinfo.value.key == "python.install.0.method" def test_python_install_requirements_check_valid(self, tmpdir): - apply_fs(tmpdir, {'requirements.txt': ''}) - build = self.get_build_config( + apply_fs(tmpdir, {"requirements.txt": ""}) + build = get_build_config( { - 'python': { - 'install': [{ - 'requirements': 'requirements.txt' - }], + "python": { + "install": [{"requirements": "requirements.txt"}], }, }, source_file=str(tmpdir.join('readthedocs.yml')), @@ -1438,32 +741,36 @@ def test_python_install_requirements_check_valid(self, tmpdir): install = 
build.python.install assert len(install) == 1 assert isinstance(install[0], PythonInstallRequirements) - assert install[0].requirements == 'requirements.txt' + assert install[0].requirements == "requirements.txt" def test_python_install_requirements_does_not_allow_null(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'requirements': None, - }], + "python": { + "install": [ + { + "path": ".", + "requirements": None, + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0.requirements' + assert excinfo.value.key == "python.install.0.requirements" def test_python_install_requirements_error_msg(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'requirements': None, - }], + "python": { + "install": [ + { + "path": ".", + "requirements": None, + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), @@ -1477,190 +784,132 @@ def test_python_install_requirements_error_msg(self, tmpdir): ) def test_python_install_requirements_does_not_allow_empty_string(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'requirements': '', - }], + "python": { + "install": [ + { + "path": ".", + "requirements": "", + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0.requirements' - - def test_python_install_requirements_ignores_default(self, tmpdir): - apply_fs(tmpdir, {'requirements.txt': ''}) - build = self.get_build_config( - {}, - {'defaults': {'requirements_file': 'requirements.txt'}}, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - assert build.python.install == [] - - def 
test_python_install_requirements_priority_over_default(self, tmpdir): - apply_fs(tmpdir, {'requirements.txt': ''}) - build = self.get_build_config( - { - 'python': { - 'install': [{ - 'requirements': 'requirements.txt' - }], - }, - }, - {'defaults': {'requirements_file': 'requirements-default.txt'}}, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - install = build.python.install - assert len(install) == 1 - assert install[0].requirements == 'requirements.txt' + assert excinfo.value.key == "python.install.0.requirements" - @pytest.mark.parametrize('value', [3, [], {}]) + @pytest.mark.parametrize("value", [3, [], {}]) def test_python_install_requirements_check_invalid_types(self, value, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'requirements': value, - }], + "python": { + "install": [ + { + "path": ".", + "requirements": value, + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0.requirements' + assert excinfo.value.key == "python.install.0.requirements" def test_python_install_path_is_required(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'method': 'pip', - }], + "python": { + "install": [ + { + "method": "pip", + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0' + assert excinfo.value.key == "python.install.0" assert excinfo.value.code == CONFIG_REQUIRED def test_python_install_pip_check_valid(self, tmpdir): - build = self.get_build_config( - { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'pip', - }], - }, - }, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - install = build.python.install - assert len(install) == 1 - assert 
install[0].path == '.' - assert install[0].method == PIP - - def test_python_install_pip_have_priority_over_default(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'pip', - }], + "python": { + "install": [ + { + "path": ".", + "method": "pip", + } + ], }, }, - {'defaults': {'install_project': True}}, source_file=str(tmpdir.join('readthedocs.yml')), ) build.validate() install = build.python.install assert len(install) == 1 - assert install[0].path == '.' + assert install[0].path == "." assert install[0].method == PIP def test_python_install_setuptools_check_valid(self, tmpdir): - build = self.get_build_config( - { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'setuptools', - }], - }, - }, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - install = build.python.install - assert len(install) == 1 - assert install[0].path == '.' - assert install[0].method == SETUPTOOLS - - def test_python_install_setuptools_ignores_default(self): - build = self.get_build_config( - {}, - {'defaults': {'install_project': True}}, - ) - build.validate() - assert build.python.install == [] - - def test_python_install_setuptools_priority_over_default(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'setuptools', - }], + "python": { + "install": [ + { + "path": ".", + "method": "setuptools", + } + ], }, }, - {'defaults': {'install_project': False}}, source_file=str(tmpdir.join('readthedocs.yml')), ) build.validate() install = build.python.install assert len(install) == 1 - assert install[0].path == '.' + assert install[0].path == "." 
assert install[0].method == SETUPTOOLS def test_python_install_allow_empty_list(self): - build = self.get_build_config({'python': {'install': []}},) + build = get_build_config( + {"python": {"install": []}}, + ) build.validate() assert build.python.install == [] def test_python_install_default(self): - build = self.get_build_config({'python': {}}) + build = get_build_config({"python": {}}) build.validate() assert build.python.install == [] - @pytest.mark.parametrize('value', [2, 'string', {}]) + @pytest.mark.parametrize("value", [2, "string", {}]) def test_python_install_check_invalid_type(self, value): - build = self.get_build_config({'python': {'install': value}},) + build = get_build_config( + {"python": {"install": value}}, + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install' + assert excinfo.value.key == "python.install" def test_python_install_extra_requirements_and_pip(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'pip', - 'extra_requirements': ['docs', 'tests'], - }], + "python": { + "install": [ + { + "path": ".", + "method": "pip", + "extra_requirements": ["docs", "tests"], + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), @@ -1668,52 +917,58 @@ def test_python_install_extra_requirements_and_pip(self, tmpdir): build.validate() install = build.python.install assert len(install) == 1 - assert install[0].extra_requirements == ['docs', 'tests'] + assert install[0].extra_requirements == ["docs", "tests"] def test_python_install_extra_requirements_and_setuptools(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'setuptools', - 'extra_requirements': ['docs', 'tests'], - }], + "python": { + "install": [ + { + "path": ".", + "method": "setuptools", + "extra_requirements": ["docs", "tests"], + } + ], } }, 
source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0.extra_requirements' + assert excinfo.value.key == "python.install.0.extra_requirements" - @pytest.mark.parametrize('value', [2, 'invalid', {}, '', None]) + @pytest.mark.parametrize("value", [2, "invalid", {}, "", None]) def test_python_install_extra_requirements_check_type(self, value, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'pip', - 'extra_requirements': value, - }], + "python": { + "install": [ + { + "path": ".", + "method": "pip", + "extra_requirements": value, + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'python.install.0.extra_requirements' + assert excinfo.value.key == "python.install.0.extra_requirements" def test_python_install_extra_requirements_allow_empty(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'python': { - 'install': [{ - 'path': '.', - 'method': 'pip', - 'extra_requirements': [], - }], + "python": { + "install": [ + { + "path": ".", + "method": "pip", + "extra_requirements": [], + } + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), @@ -1724,24 +979,31 @@ def test_python_install_extra_requirements_allow_empty(self, tmpdir): assert install[0].extra_requirements == [] def test_python_install_several_respects_order(self, tmpdir): - apply_fs(tmpdir, { - 'one': {}, - 'two': {}, - 'three.txt': '', - }) - build = self.get_build_config( + apply_fs( + tmpdir, { - 'python': { - 'install': [{ - 'path': 'one', - 'method': 'pip', - 'extra_requirements': [], - }, { - 'path': 'two', - 'method': 'setuptools', - }, { - 'requirements': 'three.txt', - }], + "one": {}, + "two": {}, + "three.txt": "", + }, + ) + build = get_build_config( + { + "python": { + "install": [ + { + 
"path": "one", + "method": "pip", + "extra_requirements": [], + }, + { + "path": "two", + "method": "setuptools", + }, + { + "requirements": "three.txt", + }, + ], }, }, source_file=str(tmpdir.join('readthedocs.yml')), @@ -1757,383 +1019,374 @@ def test_python_install_several_respects_order(self, tmpdir): assert install[1].path == 'two' assert install[1].method == SETUPTOOLS - assert install[2].requirements == 'three.txt' + assert install[2].requirements == "three.txt" - @pytest.mark.parametrize('value', [[], True, 0, 'invalid']) + @pytest.mark.parametrize("value", [[], True, 0, "invalid"]) def test_sphinx_validate_type(self, value): - build = self.get_build_config({'sphinx': value}) + build = get_build_config({"sphinx": value}) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'sphinx' + assert excinfo.value.key == "sphinx" def test_sphinx_is_default_doc_type(self): - build = self.get_build_config({}) + build = get_build_config({}) build.validate() assert build.sphinx is not None assert build.mkdocs is None - assert build.doctype == 'sphinx' + assert build.doctype == "sphinx" @pytest.mark.parametrize( 'value,expected', [ - ('html', 'sphinx'), - ('htmldir', 'sphinx_htmldir'), - ('dirhtml', 'sphinx_htmldir'), - ('singlehtml', 'sphinx_singlehtml'), + ("html", "sphinx"), + ("htmldir", "sphinx_htmldir"), + ("dirhtml", "sphinx_htmldir"), + ("singlehtml", "sphinx_singlehtml"), ], ) def test_sphinx_builder_check_valid(self, value, expected): - build = self.get_build_config( - {'sphinx': {'builder': value}}, - {'defaults': {'doctype': expected}}, + build = get_build_config( + {"sphinx": {"builder": value}}, ) build.validate() assert build.sphinx.builder == expected assert build.doctype == expected - @pytest.mark.parametrize('value', [[], True, 0, 'invalid']) + @pytest.mark.parametrize("value", [[], True, 0, "invalid"]) def test_sphinx_builder_check_invalid(self, value): - build = self.get_build_config({'sphinx': {'builder': value}}) + 
build = get_build_config({"sphinx": {"builder": value}}) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'sphinx.builder' + assert excinfo.value.key == "sphinx.builder" def test_sphinx_builder_default(self): - build = self.get_build_config({}) + build = get_build_config({}) build.validate() - build.sphinx.builder == 'sphinx' + assert build.sphinx.builder == "sphinx" def test_sphinx_builder_ignores_default(self): - build = self.get_build_config( + build = get_build_config( {}, - {'defaults': {'doctype': 'sphinx_singlehtml'}}, ) build.validate() - build.sphinx.builder == 'sphinx' + assert build.sphinx.builder == "sphinx" def test_sphinx_configuration_check_valid(self, tmpdir): - apply_fs(tmpdir, {'conf.py': ''}) - build = self.get_build_config( - {'sphinx': {'configuration': 'conf.py'}}, - source_file=str(tmpdir.join('readthedocs.yml')), + apply_fs(tmpdir, {"conf.py": ""}) + build = get_build_config( + {"sphinx": {"configuration": "conf.py"}}, + source_file=str(tmpdir.join("readthedocs.yml")), ) build.validate() - assert build.sphinx.configuration == 'conf.py' + assert build.sphinx.configuration == "conf.py" def test_sphinx_cant_be_used_with_mkdocs(self, tmpdir): - apply_fs(tmpdir, {'conf.py': ''}) - build = self.get_build_config( + apply_fs(tmpdir, {"conf.py": ""}) + build = get_build_config( { - 'sphinx': {'configuration': 'conf.py'}, - 'mkdocs': {}, + "sphinx": {"configuration": "conf.py"}, + "mkdocs": {}, }, - source_file=str(tmpdir.join('readthedocs.yml')), + source_file=str(tmpdir.join("readthedocs.yml")), ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == '.' + assert excinfo.value.key == "."
def test_sphinx_configuration_allow_null(self): - build = self.get_build_config({'sphinx': {'configuration': None}},) + build = get_build_config( + {"sphinx": {"configuration": None}}, + ) build.validate() assert build.sphinx.configuration is None def test_sphinx_configuration_check_default(self): - build = self.get_build_config({}) - build.validate() - assert build.sphinx.configuration is None - - def test_sphinx_configuration_respects_default(self, tmpdir): - apply_fs(tmpdir, {'conf.py': ''}) - build = self.get_build_config( - {}, - {'defaults': {'sphinx_configuration': 'conf.py'}}, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - assert build.sphinx.configuration == 'conf.py' - - def test_sphinx_configuration_default_can_be_none(self, tmpdir): - apply_fs(tmpdir, {'conf.py': ''}) - build = self.get_build_config( - {}, - {'defaults': {'sphinx_configuration': None}}, - source_file=str(tmpdir.join('readthedocs.yml')), - ) + build = get_build_config({}) build.validate() assert build.sphinx.configuration is None - def test_sphinx_configuration_priorities_over_default(self, tmpdir): - apply_fs(tmpdir, {'conf.py': '', 'conf-default.py': ''}) - build = self.get_build_config( - {'sphinx': {'configuration': 'conf.py'}}, - {'defaults': {'sphinx_configuration': 'conf-default.py'}}, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - assert build.sphinx.configuration == 'conf.py' - - @pytest.mark.parametrize('value', [[], True, 0, {}]) + @pytest.mark.parametrize("value", [[], True, 0, {}]) def test_sphinx_configuration_validate_type(self, value): - build = self.get_build_config({'sphinx': {'configuration': value}},) + build = get_build_config( + {"sphinx": {"configuration": value}}, + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'sphinx.configuration' + assert excinfo.value.key == "sphinx.configuration" - @pytest.mark.parametrize('value', [True, False]) + 
@pytest.mark.parametrize("value", [True, False]) def test_sphinx_fail_on_warning_check_valid(self, value): - build = self.get_build_config({'sphinx': {'fail_on_warning': value}}) + build = get_build_config({"sphinx": {"fail_on_warning": value}}) build.validate() assert build.sphinx.fail_on_warning is value - @pytest.mark.parametrize('value', [[], 'invalid', 5]) + @pytest.mark.parametrize("value", [[], "invalid", 5]) def test_sphinx_fail_on_warning_check_invalid(self, value): - build = self.get_build_config({'sphinx': {'fail_on_warning': value}}) + build = get_build_config({"sphinx": {"fail_on_warning": value}}) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'sphinx.fail_on_warning' + assert excinfo.value.key == "sphinx.fail_on_warning" def test_sphinx_fail_on_warning_check_default(self): - build = self.get_build_config({}) + build = get_build_config({}) build.validate() assert build.sphinx.fail_on_warning is False - @pytest.mark.parametrize('value', [[], True, 0, 'invalid']) + @pytest.mark.parametrize("value", [[], True, 0, "invalid"]) def test_mkdocs_validate_type(self, value): - build = self.get_build_config({'mkdocs': value}) + build = get_build_config({"mkdocs": value}) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'mkdocs' + assert excinfo.value.key == "mkdocs" def test_mkdocs_default(self): - build = self.get_build_config({}) + build = get_build_config({}) build.validate() assert build.mkdocs is None def test_mkdocs_configuration_check_valid(self, tmpdir): - apply_fs(tmpdir, {'mkdocs.yml': ''}) - build = self.get_build_config( - {'mkdocs': {'configuration': 'mkdocs.yml'}}, - {'defaults': {'doctype': 'mkdocs'}}, - source_file=str(tmpdir.join('readthedocs.yml')), + apply_fs(tmpdir, {"mkdocs.yml": ""}) + build = get_build_config( + {"mkdocs": {"configuration": "mkdocs.yml"}}, + source_file=str(tmpdir.join("readthedocs.yml")), ) build.validate() - assert build.mkdocs.configuration == 
'mkdocs.yml' - assert build.doctype == 'mkdocs' + assert build.mkdocs.configuration == "mkdocs.yml" + assert build.doctype == "mkdocs" assert build.sphinx is None def test_mkdocs_configuration_allow_null(self): - build = self.get_build_config( - {'mkdocs': {'configuration': None}}, - {'defaults': {'doctype': 'mkdocs'}}, + build = get_build_config( + {"mkdocs": {"configuration": None}}, ) build.validate() assert build.mkdocs.configuration is None def test_mkdocs_configuration_check_default(self): - build = self.get_build_config( - {'mkdocs': {}}, - {'defaults': {'doctype': 'mkdocs'}}, + build = get_build_config( + {"mkdocs": {}}, ) build.validate() assert build.mkdocs.configuration is None - @pytest.mark.parametrize('value', [[], True, 0, {}]) + @pytest.mark.parametrize("value", [[], True, 0, {}]) def test_mkdocs_configuration_validate_type(self, value): - build = self.get_build_config( - {'mkdocs': {'configuration': value}}, - {'defaults': {'doctype': 'mkdocs'}}, + build = get_build_config( + {"mkdocs": {"configuration": value}}, ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'mkdocs.configuration' + assert excinfo.value.key == "mkdocs.configuration" - @pytest.mark.parametrize('value', [True, False]) + @pytest.mark.parametrize("value", [True, False]) def test_mkdocs_fail_on_warning_check_valid(self, value): - build = self.get_build_config( - {'mkdocs': {'fail_on_warning': value}}, - {'defaults': {'doctype': 'mkdocs'}}, + build = get_build_config( + {"mkdocs": {"fail_on_warning": value}}, ) build.validate() assert build.mkdocs.fail_on_warning is value - @pytest.mark.parametrize('value', [[], 'invalid', 5]) + @pytest.mark.parametrize("value", [[], "invalid", 5]) def test_mkdocs_fail_on_warning_check_invalid(self, value): - build = self.get_build_config( - {'mkdocs': {'fail_on_warning': value}}, - {'defaults': {'doctype': 'mkdocs'}}, + build = get_build_config( + {"mkdocs": {"fail_on_warning": value}}, ) with 
raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'mkdocs.fail_on_warning' + assert excinfo.value.key == "mkdocs.fail_on_warning" def test_mkdocs_fail_on_warning_check_default(self): - build = self.get_build_config( - {'mkdocs': {}}, - {'defaults': {'doctype': 'mkdocs'}}, + build = get_build_config( + {"mkdocs": {}}, ) build.validate() assert build.mkdocs.fail_on_warning is False def test_submodule_defaults(self): - build = self.get_build_config({}) + build = get_build_config({}) build.validate() assert build.submodules.include == [] assert build.submodules.exclude == ALL assert build.submodules.recursive is False - @pytest.mark.parametrize('value', [[], 'invalid', 0]) + @pytest.mark.parametrize("value", [[], "invalid", 0]) def test_submodules_check_invalid_type(self, value): - build = self.get_build_config({'submodules': value}) + build = get_build_config({"submodules": value}) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'submodules' + assert excinfo.value.key == "submodules" def test_submodules_include_check_valid(self): - build = self.get_build_config({ - 'submodules': { - 'include': ['one', 'two'], - }, - }) + build = get_build_config( + { + "submodules": { + "include": ["one", "two"], + }, + } + ) build.validate() - assert build.submodules.include == ['one', 'two'] + assert build.submodules.include == ["one", "two"] assert build.submodules.exclude == [] assert build.submodules.recursive is False - @pytest.mark.parametrize('value', ['invalid', True, 0, {}]) + @pytest.mark.parametrize("value", ["invalid", True, 0, {}]) def test_submodules_include_check_invalid(self, value): - build = self.get_build_config({ - 'submodules': { - 'include': value, - }, - }) + build = get_build_config( + { + "submodules": { + "include": value, + }, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'submodules.include' + assert excinfo.value.key == "submodules.include" 
def test_submodules_include_allows_all_keyword(self): - build = self.get_build_config({ - 'submodules': { - 'include': 'all', - }, - }) + build = get_build_config( + { + "submodules": { + "include": "all", + }, + } + ) build.validate() assert build.submodules.include == ALL assert build.submodules.exclude == [] assert build.submodules.recursive is False def test_submodules_exclude_check_valid(self): - build = self.get_build_config({ - 'submodules': { - 'exclude': ['one', 'two'], - }, - }) + build = get_build_config( + { + "submodules": { + "exclude": ["one", "two"], + }, + } + ) build.validate() assert build.submodules.include == [] - assert build.submodules.exclude == ['one', 'two'] + assert build.submodules.exclude == ["one", "two"] assert build.submodules.recursive is False - @pytest.mark.parametrize('value', ['invalid', True, 0, {}]) + @pytest.mark.parametrize("value", ["invalid", True, 0, {}]) def test_submodules_exclude_check_invalid(self, value): - build = self.get_build_config({ - 'submodules': { - 'exclude': value, - }, - }) + build = get_build_config( + { + "submodules": { + "exclude": value, + }, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'submodules.exclude' + assert excinfo.value.key == "submodules.exclude" def test_submodules_exclude_allows_all_keyword(self): - build = self.get_build_config({ - 'submodules': { - 'exclude': 'all', - }, - }) + build = get_build_config( + { + "submodules": { + "exclude": "all", + }, + } + ) build.validate() assert build.submodules.include == [] assert build.submodules.exclude == ALL assert build.submodules.recursive is False def test_submodules_cant_exclude_and_include(self): - build = self.get_build_config({ - 'submodules': { - 'include': ['two'], - 'exclude': ['one'], - }, - }) + build = get_build_config( + { + "submodules": { + "include": ["two"], + "exclude": ["one"], + }, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 
'submodules' + assert excinfo.value.key == "submodules" def test_submodules_can_exclude_include_be_empty(self): - build = self.get_build_config({ - 'submodules': { - 'exclude': 'all', - 'include': [], - }, - }) + build = get_build_config( + { + "submodules": { + "exclude": "all", + "include": [], + }, + } + ) build.validate() assert build.submodules.include == [] assert build.submodules.exclude == ALL assert build.submodules.recursive is False - @pytest.mark.parametrize('value', [True, False]) + @pytest.mark.parametrize("value", [True, False]) def test_submodules_recursive_check_valid(self, value): - build = self.get_build_config({ - 'submodules': { - 'include': ['one', 'two'], - 'recursive': value, - }, - }) + build = get_build_config( + { + "submodules": { + "include": ["one", "two"], + "recursive": value, + }, + } + ) build.validate() - assert build.submodules.include == ['one', 'two'] + assert build.submodules.include == ["one", "two"] assert build.submodules.exclude == [] assert build.submodules.recursive is value - @pytest.mark.parametrize('value', [[], 'invalid', 5]) + @pytest.mark.parametrize("value", [[], "invalid", 5]) def test_submodules_recursive_check_invalid(self, value): - build = self.get_build_config({ - 'submodules': { - 'include': ['one', 'two'], - 'recursive': value, - }, - }) + build = get_build_config( + { + "submodules": { + "include": ["one", "two"], + "recursive": value, + }, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'submodules.recursive' + assert excinfo.value.key == "submodules.recursive" def test_submodules_recursive_explicit_default(self): - build = self.get_build_config({ - 'submodules': { - 'include': [], - 'recursive': False, - }, - }) + build = get_build_config( + { + "submodules": { + "include": [], + "recursive": False, + }, + } + ) build.validate() assert build.submodules.include == [] assert build.submodules.exclude == ALL assert build.submodules.recursive is False - build = 
self.get_build_config({ - 'submodules': { - 'exclude': [], - 'recursive': False, - }, - }) + build = get_build_config( + { + "submodules": { + "exclude": [], + "recursive": False, + }, + } + ) build.validate() assert build.submodules.include == [] assert build.submodules.exclude == [] assert build.submodules.recursive is False - @pytest.mark.parametrize('value', ['invalid', True, 0, []]) + @pytest.mark.parametrize("value", ["invalid", True, 0, []]) def test_search_invalid_type(self, value): - build = self.get_build_config({ - 'search': value, - }) + build = get_build_config( + { + "search": value, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'search' + assert excinfo.value.key == "search" @pytest.mark.parametrize( 'value', @@ -2142,59 +1395,68 @@ def test_search_invalid_type(self, value): True, 0, [], - {'foo/bar': 11}, - {'foo/bar': -11}, - {'foo/bar': 2.5}, - {'foo/bar': 'bar'}, - {'/': 1}, - {'/foo/..': 1}, - {'..': 1}, - {'/foo/bar/../../../': 1}, - {10: 'bar'}, + {"foo/bar": 11}, + {"foo/bar": -11}, + {"foo/bar": 2.5}, + {"foo/bar": "bar"}, + {"/": 1}, + {"/foo/..": 1}, + {"..": 1}, + {"/foo/bar/../../../": 1}, + {10: "bar"}, {10: 0}, ], ) def test_search_ranking_invalid_type(self, value): - build = self.get_build_config({ - 'search': {'ranking': value}, - }) + build = get_build_config( + { + "search": {"ranking": value}, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'search.ranking' + assert excinfo.value.key == "search.ranking" - @pytest.mark.parametrize('value', list(range(-10, 10 + 1))) + @pytest.mark.parametrize("value", list(range(-10, 10 + 1))) def test_search_valid_ranking(self, value): - build = self.get_build_config({ - 'search': { - 'ranking': { - 'foo/bar': value, - 'bar/foo': value, + build = get_build_config( + { + "search": { + "ranking": { + "foo/bar": value, + "bar/foo": value, + }, }, - }, - }) + } + ) build.validate() assert build.search.ranking 
== {'foo/bar': value, 'bar/foo': value} - @pytest.mark.parametrize('path, expected', [ - ('/foo/bar', 'foo/bar'), - ('///foo//bar', 'foo/bar'), - ('///foo//bar/', 'foo/bar'), - ('/foo/bar/../', 'foo'), - ('/foo*', 'foo*'), - ('/foo/bar/*', 'foo/bar/*'), - ('/foo/bar?/*', 'foo/bar?/*'), - ('foo/[bc]ar/*/', 'foo/[bc]ar/*'), - ('*', '*'), - ('index.html', 'index.html'), - ]) + @pytest.mark.parametrize( + "path, expected", + [ + ("/foo/bar", "foo/bar"), + ("///foo//bar", "foo/bar"), + ("///foo//bar/", "foo/bar"), + ("/foo/bar/../", "foo"), + ("/foo*", "foo*"), + ("/foo/bar/*", "foo/bar/*"), + ("/foo/bar?/*", "foo/bar?/*"), + ("foo/[bc]ar/*/", "foo/[bc]ar/*"), + ("*", "*"), + ("index.html", "index.html"), + ], + ) def test_search_ranking_normilize_path(self, path, expected): - build = self.get_build_config({ - 'search': { - 'ranking': { - path: 1, + build = get_build_config( + { + "search": { + "ranking": { + path: 1, + }, }, - }, - }) + } + ) build.validate() assert build.search.ranking == {expected: 1} @@ -2205,198 +1467,150 @@ def test_search_ranking_normilize_path(self, path, expected): True, 0, [2, 3], - {'foo/bar': 11}, + {"foo/bar": 11}, ], ) def test_search_ignore_invalid_type(self, value): - build = self.get_build_config({ - 'search': {'ignore': value}, - }) + build = get_build_config( + { + "search": {"ignore": value}, + } + ) with raises(InvalidConfig) as excinfo: build.validate() - assert excinfo.value.key == 'search.ignore' - - @pytest.mark.parametrize('path, expected', [ - ('/foo/bar', 'foo/bar'), - ('///foo//bar', 'foo/bar'), - ('///foo//bar/', 'foo/bar'), - ('/foo/bar/../', 'foo'), - ('/foo*', 'foo*'), - ('/foo/bar/*', 'foo/bar/*'), - ('/foo/bar?/*', 'foo/bar?/*'), - ('foo/[bc]ar/*/', 'foo/[bc]ar/*'), - ('*', '*'), - ('index.html', 'index.html'), - ]) + assert excinfo.value.key == "search.ignore" + + @pytest.mark.parametrize( + "path, expected", + [ + ("/foo/bar", "foo/bar"), + ("///foo//bar", "foo/bar"), + ("///foo//bar/", "foo/bar"), + ("/foo/bar/../", 
"foo"), + ("/foo*", "foo*"), + ("/foo/bar/*", "foo/bar/*"), + ("/foo/bar?/*", "foo/bar?/*"), + ("foo/[bc]ar/*/", "foo/[bc]ar/*"), + ("*", "*"), + ("index.html", "index.html"), + ], + ) def test_search_ignore_valid_type(self, path, expected): - build = self.get_build_config({ - 'search': { - 'ignore': [path], - }, - }) + build = get_build_config( + { + "search": { + "ignore": [path], + }, + } + ) build.validate() assert build.search.ignore == [expected] - @pytest.mark.parametrize('value,key', [ - ({'typo': 'something'}, 'typo'), - ( - { - 'pyton': { - 'version': 'another typo', - } - }, - 'pyton.version' - ), - ( - { - 'build': { - 'image': 'latest', - 'extra': 'key', - } - }, - 'build.extra' - ), - ( - { - 'python': { - 'install': [{ - 'path': '.', - }, { - 'path': '.', - 'method': 'pip', - 'invalid': 'key', - }] - } - }, - 'python.install.1.invalid' - ) - ]) + @pytest.mark.parametrize( + "value,key", + [ + ({"typo": "something"}, "typo"), + ( + { + "pyton": { + "version": "another typo", + } + }, + "pyton.version", + ), + ( + { + "build": { + "os": "ubuntu-22.04", + "tools": {"python": "3"}, + "extra": "key", + } + }, + "build.extra", + ), + ( + { + "python": { + "install": [ + { + "path": ".", + }, + { + "path": ".", + "method": "pip", + "invalid": "key", + }, + ] + } + }, + "python.install.1.invalid", + ), + ], + ) def test_strict_validation(self, value, key): - build = self.get_build_config(value) + build = get_build_config(value) with raises(InvalidConfig) as excinfo: build.validate() assert excinfo.value.key == key assert excinfo.value.code == INVALID_KEY - def test_strict_validation_pops_all_keys(self): - build = self.get_build_config({ - 'version': 2, - 'python': { - 'version': 3, - }, - }) - build.validate() - assert build._raw_config == {} - @pytest.mark.parametrize( 'value,expected', [ ({}, []), - ({'one': 1}, ['one']), - ({'one': {'two': 3}}, ['one', 'two']), - (OrderedDict([('one', 1), ('two', 2)]), ['one']), - (OrderedDict([('one', {'two': 2}), 
('three', 3)]), ['one', 'two']), + ({"one": 1}, ["one"]), + ({"one": {"two": 3}}, ["one", "two"]), + (OrderedDict([("one", 1), ("two", 2)]), ["one"]), + (OrderedDict([("one", {"two": 2}), ("three", 3)]), ["one", "two"]), ], ) def test_get_extra_key(self, value, expected): - build = self.get_build_config({}) + build = get_build_config({}) assert build._get_extra_key(value) == expected def test_pop_config_single(self): - build = self.get_build_config({'one': 1}) - build.pop_config('one') + build = get_build_config({}) + build.pop_config("version") + build.pop_config("build") assert build._raw_config == {} def test_pop_config_nested(self): - build = self.get_build_config({'one': {'two': 2}}) - build.pop_config('one.two') + build = get_build_config({}) + build.pop_config("version") + build.pop_config("build.os") + build.pop_config("build.tools") assert build._raw_config == {} def test_pop_config_nested_with_residue(self): - build = self.get_build_config({'one': {'two': 2, 'three': 3}}) - build.pop_config('one.two') - assert build._raw_config == {'one': {'three': 3}} + build = get_build_config({}) + build.pop_config("version") + build.pop_config("build.tools") + assert build._raw_config == {"build": {"os": "ubuntu-22.04"}} def test_pop_config_default_none(self): - build = self.get_build_config({'one': {'two': 2, 'three': 3}}) - assert build.pop_config('one.four') is None - assert build._raw_config == {'one': {'two': 2, 'three': 3}} + build = get_build_config({}) + assert build.pop_config("one.four") is None def test_pop_config_default(self): - build = self.get_build_config({'one': {'two': 2, 'three': 3}}) - assert build.pop_config('one.four', 4) == 4 - assert build._raw_config == {'one': {'two': 2, 'three': 3}} + build = get_build_config({}) + assert build.pop_config("one.four", 4) == 4 def test_pop_config_raise_exception(self): - build = self.get_build_config({'one': {'two': 2, 'three': 3}}) + build = get_build_config({}) with raises(ValidationError) as excinfo: - 
build.pop_config('one.four', raise_ex=True) - assert excinfo.value.value == 'four' + build.pop_config("build.invalid", raise_ex=True) + assert excinfo.value.value == "invalid" assert excinfo.value.code == VALUE_NOT_FOUND - def test_as_dict(self, tmpdir): - apply_fs(tmpdir, {'requirements.txt': ''}) - build = self.get_build_config( - { - 'version': 2, - 'formats': ['pdf'], - 'python': { - 'version': '3.6', - 'install': [{ - 'requirements': 'requirements.txt', - }], - }, - }, - source_file=str(tmpdir.join('readthedocs.yml')), - ) - build.validate() - expected_dict = { - 'version': '2', - 'formats': ['pdf'], - 'python': { - 'version': '3.6', - 'install': [{ - 'requirements': 'requirements.txt', - }], - }, - 'build': { - 'image': 'readthedocs/build:latest', - 'apt_packages': [], - }, - 'conda': None, - 'sphinx': { - 'builder': 'sphinx', - 'configuration': None, - 'fail_on_warning': False, - }, - 'mkdocs': None, - 'doctype': 'sphinx', - 'submodules': { - 'include': [], - 'exclude': ALL, - 'recursive': False, - }, - 'search': { - 'ranking': {}, - 'ignore': [ - 'search.html', - 'search/index.html', - '404.html', - '404/index.html', - ], - }, - } - assert build.as_dict() == expected_dict - def test_as_dict_new_build_config(self, tmpdir): - build = self.get_build_config( + build = get_build_config( { - 'version': 2, - 'formats': ['pdf'], - 'build': { - 'os': 'ubuntu-20.04', - 'tools': { - 'python': '3.9', - 'nodejs': '16', + "version": 2, + "formats": ["pdf"], + "build": { + "os": "ubuntu-20.04", + "tools": { + "python": "3.9", + "nodejs": "16", }, }, 'python': { @@ -2412,7 +1626,6 @@ def test_as_dict_new_build_config(self, tmpdir): 'version': '2', 'formats': ['pdf'], 'python': { - 'version': None, 'install': [{ 'requirements': 'requirements.txt', }], diff --git a/readthedocs/doc_builder/config.py b/readthedocs/doc_builder/config.py index be554203829..dc9adb84786 100644 --- a/readthedocs/doc_builder/config.py +++ b/readthedocs/doc_builder/config.py @@ -1,13 +1,7 @@ """An 
API to load config from a readthedocs.yml file.""" -from os import path -from readthedocs.config import BuildConfigV1 from readthedocs.config import load as load_config -from readthedocs.projects.models import ProjectConfigurationError - -from ..config.config import DefaultConfigFileNotFound -from .constants import DOCKER_IMAGE, DOCKER_IMAGE_SETTINGS def load_yaml_config(version, readthedocs_yaml_path=None): @@ -21,56 +15,18 @@ def load_yaml_config(version, readthedocs_yaml_path=None): load instead of using defaults. """ checkout_path = version.project.checkout_path(version.slug) - project = version.project + + # TODO: review this function since we are removing all the defaults for BuildConfigV2 as well. + # NOTE: all the configuration done on the UI will have no effect at all from now on. # Get build image to set up the python version validation. Pass in the # build image python limitations to the loaded config so that the versions # can be rejected at validation - img_name = project.container_image or DOCKER_IMAGE - python_version = '3' if project.python_interpreter == 'python3' else '2' - try: - sphinx_configuration = path.join( - version.get_conf_py_path(), - 'conf.py', - ) - except ProjectConfigurationError: - sphinx_configuration = None - - env_config = { - 'build': { - 'image': img_name, - }, - 'defaults': { - 'install_project': project.install_project, - 'formats': get_default_formats(project), - 'requirements_file': project.requirements_file, - 'python_version': python_version, - 'sphinx_configuration': sphinx_configuration, - 'build_image': project.container_image, - 'doctype': project.documentation_type, - }, - } - img_settings = DOCKER_IMAGE_SETTINGS.get(img_name, None) - if img_settings: - env_config.update(img_settings) - - try: - config = load_config( - path=checkout_path, - env_config=env_config, - readthedocs_yaml_path=readthedocs_yaml_path, - ) - except DefaultConfigFileNotFound: - # Default to use v1 with some defaults from the web interface - # 
if we don't find a configuration file. - config = BuildConfigV1( - env_config=env_config, - raw_config={}, - base_path=checkout_path, - source_file=None, - ) - config.validate() + config = load_config( + path=checkout_path, + readthedocs_yaml_path=readthedocs_yaml_path, + ) return config diff --git a/readthedocs/doc_builder/constants.py b/readthedocs/doc_builder/constants.py index cb41d94b2c6..73228946e28 100644 --- a/readthedocs/doc_builder/constants.py +++ b/readthedocs/doc_builder/constants.py @@ -14,7 +14,6 @@ DOCKER_SOCKET = settings.DOCKER_SOCKET DOCKER_VERSION = settings.DOCKER_VERSION DOCKER_IMAGE = settings.DOCKER_IMAGE -DOCKER_IMAGE_SETTINGS = settings.DOCKER_IMAGE_SETTINGS DOCKER_LIMITS = settings.DOCKER_LIMITS DOCKER_TIMEOUT_EXIT_CODE = 42 DOCKER_OOM_EXIT_CODE = 137 diff --git a/readthedocs/doc_builder/director.py b/readthedocs/doc_builder/director.py index 591186bfdbe..fcbb4c47c2e 100644 --- a/readthedocs/doc_builder/director.py +++ b/readthedocs/doc_builder/director.py @@ -183,8 +183,7 @@ def setup_environment(self): self.run_build_job("post_system_dependencies") # Install all ``build.tools`` specified by the user - if self.data.config.using_build_tools: - self.install_build_tools() + self.install_build_tools() self.run_build_job("pre_create_environment") self.create_environment() diff --git a/readthedocs/doc_builder/python_environments.py b/readthedocs/doc_builder/python_environments.py index da33474c37f..0a359a4acf7 100644 --- a/readthedocs/doc_builder/python_environments.py +++ b/readthedocs/doc_builder/python_environments.py @@ -1,7 +1,6 @@ """An abstraction over virtualenv and Conda environments.""" import copy -import itertools import os import structlog @@ -12,7 +11,6 @@ from readthedocs.config.models import PythonInstall, PythonInstallRequirements from readthedocs.core.utils.filesystem import safe_open from readthedocs.doc_builder.config import load_yaml_config -from readthedocs.doc_builder.loader import get_builder_class from 
readthedocs.projects.exceptions import UserFileNotFound from readthedocs.projects.models import Feature @@ -160,21 +158,10 @@ def install_core_requirements(self): '--no-cache-dir', ] - if self.project.has_feature(Feature.INSTALL_LATEST_CORE_REQUIREMENTS): - self._install_latest_requirements(pip_install_cmd) - else: - self._install_old_requirements(pip_install_cmd) + self._install_latest_requirements(pip_install_cmd) def _install_latest_requirements(self, pip_install_cmd): - """ - Install all the latest core requirements. - - By enabling the feature flag ``INSTALL_LATEST_CORE_REQUIREMENTS`` - projects will automatically get installed all the latest core - requirements: pip, setuptools, sphinx, readthedocs-sphinx-ext and mkdocs. - - This is the new behavior and where we are moving towards. - """ + """Install all the latest core requirements.""" # First, upgrade pip and setuptools to their latest versions cmd = pip_install_cmd + ["pip", "setuptools"] self.build_env.run( @@ -204,80 +191,6 @@ def _install_latest_requirements(self, pip_install_cmd): cwd=self.checkout_path, ) - def _install_old_requirements(self, pip_install_cmd): - """ - Install old core requirements. - - There are bunch of feature flags that will be taken in consideration to - decide whether or not upgrade some of the core dependencies to their - latest versions. - - This is the old behavior and the one we want to get rid off. - """ - # Install latest pip and setuptools first, - # so it is used when installing the other requirements. - pip_version = self.project.get_feature_value( - Feature.DONT_INSTALL_LATEST_PIP, - # 20.3 uses the new resolver by default. - positive='pip<20.3', - negative='pip', - ) - # Installing a project with setup.py install is deprecated - # in new versions of setuptools, so we need to pin setuptools - # to a supported version if the project is using setup.py install. 
- setuptools_version = ( - "setuptools<58.3.0" - if self.config.is_using_setup_py_install - else "setuptools" - ) - cmd = pip_install_cmd + [pip_version, setuptools_version] - self.build_env.run( - *cmd, - bin_path=self.venv_bin(), - cwd=self.checkout_path, - ) - - requirements = [] - - # Unpin Pillow on newer Python versions to avoid re-compiling - # https://pillow.readthedocs.io/en/stable/installation.html#python-support - if self.config.python.version in ("2.7", "3.4", "3.5", "3.6", "3.7"): - requirements.append("pillow==5.4.1") - else: - requirements.append("pillow") - - requirements.extend( - [ - "mock==1.0.1", - "alabaster>=0.7,<0.8,!=0.7.5", - "commonmark==0.9.1", - "recommonmark==0.5.0", - ] - ) - - if self.config.doctype == 'mkdocs': - requirements.append("mkdocs") - else: - requirements.extend( - [ - "sphinx", - "sphinx-rtd-theme", - self.project.get_feature_value( - Feature.USE_SPHINX_RTD_EXT_LATEST, - positive="readthedocs-sphinx-ext", - negative="readthedocs-sphinx-ext<2.3", - ), - ] - ) - - cmd = copy.copy(pip_install_cmd) - cmd.extend(requirements) - self.build_env.run( - *cmd, - bin_path=self.venv_bin(), - cwd=self.checkout_path, - ) - def install_requirements_file(self, install): """ Install a requirements file using pip. @@ -286,27 +199,6 @@ def install_requirements_file(self, install): :type install: readthedocs.config.models.PythonInstallRequirements """ requirements_file_path = install.requirements - if requirements_file_path is None: - # This only happens when the config file is from v1. - # We try to find a requirements file. 
- builder_class = get_builder_class(self.config.doctype) - docs_dir = ( - builder_class( - build_env=self.build_env, - python_env=self, - ).docs_dir() - ) - paths = [docs_dir, ''] - req_files = ['pip_requirements.txt', 'requirements.txt'] - for path, req_file in itertools.product(paths, req_files): - test_path = os.path.join(self.checkout_path, path, req_file) - if os.path.exists(test_path): - requirements_file_path = os.path.relpath( - test_path, - self.checkout_path, - ) - break - if requirements_file_path: args = [ self.venv_bin(filename='python'), @@ -351,10 +243,7 @@ def conda_bin_name(self): See https://github.com/QuantStack/mamba """ - # Config file using ``build.tools.python`` - if self.config.using_build_tools: - return self.config.python_interpreter - return 'conda' + return self.config.python_interpreter def setup_base(self): if self.project.has_feature(Feature.CONDA_APPEND_CORE_REQUIREMENTS): @@ -420,10 +309,7 @@ def _append_core_requirements(self): else: # Append conda dependencies directly to ``dependencies`` and pip # dependencies to ``dependencies.pip`` - if self.project.has_feature(Feature.INSTALL_LATEST_CORE_REQUIREMENTS): - pip_requirements, conda_requirements = self._get_new_core_requirements() - else: - pip_requirements, conda_requirements = self._get_old_core_requirements() + pip_requirements, conda_requirements = self._get_core_requirements() dependencies = environment.get('dependencies', []) pip_dependencies = {'pip': pip_requirements} @@ -461,7 +347,7 @@ def _append_core_requirements(self): 'environment file.', ) - def _get_new_core_requirements(self): + def _get_core_requirements(self): # Use conda for requirements it packages conda_requirements = [] @@ -476,26 +362,6 @@ def _get_new_core_requirements(self): return pip_requirements, conda_requirements - def _get_old_core_requirements(self): - # Use conda for requirements it packages - conda_requirements = [ - 'mock', - 'pillow', - ] - - # Install pip-only things. 
- pip_requirements = [ - 'recommonmark', - ] - - if self.config.doctype == 'mkdocs': - pip_requirements.append('mkdocs') - else: - pip_requirements.append('readthedocs-sphinx-ext') - conda_requirements.extend(['sphinx', 'sphinx_rtd_theme']) - - return pip_requirements, conda_requirements - def install_core_requirements(self): """Install basic Read the Docs requirements into the Conda env.""" @@ -505,7 +371,7 @@ def install_core_requirements(self): # create`` step. return - pip_requirements, conda_requirements = self._get_old_core_requirements() + pip_requirements, conda_requirements = self._get_core_requirements() # Install requirements via ``conda install`` command if they were # not appended to the ``environment.yml`` file. cmd = [ diff --git a/readthedocs/projects/forms.py b/readthedocs/projects/forms.py index 667fd1b278e..fc812c54a7b 100644 --- a/readthedocs/projects/forms.py +++ b/readthedocs/projects/forms.py @@ -216,13 +216,13 @@ class Meta: ) # These that can be set per-version using a config file. per_version_settings = ( - 'documentation_type', - 'requirements_file', - 'python_interpreter', - 'install_project', - 'conf_py_file', - 'enable_pdf_build', - 'enable_epub_build', + "documentation_type", + "requirements_file", + "python_interpreter", + "install_project", + "conf_py_file", + "enable_pdf_build", + "enable_epub_build", ) fields = ( *per_project_settings, @@ -255,6 +255,11 @@ def __init__(self, *args, **kwargs): self.fields.pop(field) per_project_settings.remove(field) + # TODO: remove the "Global settings" fieldset since we only have one + # fieldset now. Also, consider merging this "Advanced settings" with + # the regular "Settings" tab, and take into account that we may + # want to add a new tab for "Read the Docs Addons" to configure each of + # them from there. 
field_sets = [ Fieldset( _("Global settings"), @@ -288,6 +293,11 @@ def __init__(self, *args, **kwargs): else: self.fields['default_version'].widget.attrs['readonly'] = True + # Disable "per_version_settings" because they are deprecated. + # This fieldset will be removed in the next few weeks, after giving users some time to perform the migration. + for field in self.Meta.per_version_settings: + self.fields[field].disabled = True + self.setup_external_builds_option() def setup_external_builds_option(self): diff --git a/readthedocs/projects/models.py b/readthedocs/projects/models.py index 2161be67ce4..a0047e36289 100644 --- a/readthedocs/projects/models.py +++ b/readthedocs/projects/models.py @@ -271,26 +271,26 @@ class Project(models.Model): ), ) requirements_file = models.CharField( - _('Requirements file'), + _("Requirements file"), max_length=255, default=None, null=True, blank=True, help_text=_( - 'A ' - 'pip requirements file needed to build your documentation. ' - 'Path from the root of your project.', + "pip requirements file needed to build your documentation. " + "Path from the root of your project.", ), ) documentation_type = models.CharField( - _('Documentation type'), + _("Documentation type"), max_length=20, choices=constants.DOCUMENTATION_CHOICES, - default='sphinx', + default="sphinx", help_text=_( 'Type of documentation you are building. More info on sphinx builders.', ), ) @@ -414,44 +414,6 @@ class Project(models.Model): help_text=_('Show warning banner in non-stable nor latest versions.'), ) - # Sphinx specific build options. - enable_epub_build = models.BooleanField( - _('Enable EPUB build'), - default=False, - help_text=_( - 'Create a EPUB version of your documentation with each build.', - ), - ) - enable_pdf_build = models.BooleanField( - _('Enable PDF build'), - default=False, - help_text=_( - 'Create a PDF version of your documentation with each build.', - ), - ) - - # Other model data. 
- path = models.CharField( - _('Path'), - max_length=255, - editable=False, - help_text=_( - 'The directory where ' - 'conf.py lives', - ), - ) - conf_py_file = models.CharField( - _('Python configuration file'), - max_length=255, - default="", - blank=True, - help_text=_( - "Path from project root to conf.py file " - "(ex. docs/conf.py). " - "Leave blank if you want us to find it for you.", - ), - ) - readthedocs_yaml_path = models.CharField( _("Path for .readthedocs.yaml"), max_length=1024, @@ -484,37 +446,6 @@ class Project(models.Model): ), ) - install_project = models.BooleanField( - _('Install Project'), - help_text=_( - 'Install your project inside a virtualenv using setup.py ' - 'install', - ), - default=False, - ) - - # This model attribute holds the python interpreter used to create the - # virtual environment - python_interpreter = models.CharField( - _('Python Interpreter'), - max_length=20, - choices=constants.PYTHON_CHOICES, - default='python3', - help_text=_( - 'The Python interpreter used to create the virtual ' - 'environment.', - ), - ) - - # TODO: remove `use_system_packages` after deploying. - # This field is not used anymore. - use_system_packages = models.BooleanField( - _("Use system packages"), - help_text=_( - "Give the virtual environment access to the global site-packages dir.", - ), - default=False, - ) privacy_level = models.CharField( _('Privacy Level'), max_length=20, @@ -587,6 +518,100 @@ class Project(models.Model): blank=True, ) + # TODO: remove the following fields since they all are going to be ignored + # by the application when we start requiring a ``.readthedocs.yaml`` file. 
+ # These fields are: + # - requirements_file + # - documentation_type + # - enable_epub_build + # - enable_pdf_build + # - path + # - conf_py_file + # - install_project + # - python_interpreter + # - use_system_packages + requirements_file = models.CharField( + _("Requirements file"), + max_length=255, + default=None, + null=True, + blank=True, + help_text=_( + "A " + "pip requirements file needed to build your documentation. " + "Path from the root of your project.", + ), + ) + documentation_type = models.CharField( + _("Documentation type"), + max_length=20, + choices=constants.DOCUMENTATION_CHOICES, + default="sphinx", + help_text=_( + 'Type of documentation you are building. More info on sphinx builders.', + ), + ) + enable_epub_build = models.BooleanField( + _("Enable EPUB build"), + default=False, + help_text=_( + "Create an EPUB version of your documentation with each build.", + ), + ) + enable_pdf_build = models.BooleanField( + _("Enable PDF build"), + default=False, + help_text=_( + "Create a PDF version of your documentation with each build.", + ), + ) + path = models.CharField( + _("Path"), + max_length=255, + editable=False, + help_text=_( + "The directory where conf.py lives", + ), + ) + conf_py_file = models.CharField( + _("Python configuration file"), + max_length=255, + default="", + blank=True, + help_text=_( + "Path from project root to conf.py file " + "(ex. docs/conf.py). 
" + "Leave blank if you want us to find it for you.", + ), + ) + install_project = models.BooleanField( + _("Install Project"), + help_text=_( + "Install your project inside a virtualenv using setup.py " + "install", + ), + default=False, + ) + python_interpreter = models.CharField( + _("Python Interpreter"), + max_length=20, + choices=constants.PYTHON_CHOICES, + default="python3", + help_text=_( + "The Python interpreter used to create the virtual environment.", + ), + ) + use_system_packages = models.BooleanField( + _("Use system packages"), + help_text=_( + "Give the virtual environment access to the global site-packages dir.", + ), + default=False, + ) + # Property used for storing the latest build for a project when prefetching LATEST_BUILD_CACHE = '_latest_build' diff --git a/readthedocs/projects/tasks/builds.py b/readthedocs/projects/tasks/builds.py index 13e1f0d74db..19a0e508a85 100644 --- a/readthedocs/projects/tasks/builds.py +++ b/readthedocs/projects/tasks/builds.py @@ -38,7 +38,7 @@ from readthedocs.builds.signals import build_complete from readthedocs.builds.utils import memcache_lock from readthedocs.config import ConfigError -from readthedocs.config.config import BuildConfigV1, BuildConfigV2 +from readthedocs.config.config import BuildConfigV2 from readthedocs.doc_builder.director import BuildDirector from readthedocs.doc_builder.environments import ( DockerBuildEnvironment, @@ -116,7 +116,7 @@ class TaskData: start_time: timezone.datetime = None environment_class: type[DockerBuildEnvironment] | type[LocalBuildEnvironment] = None build_director: BuildDirector = None - config: BuildConfigV1 | BuildConfigV2 = None + config: BuildConfigV2 = None project: APIProject = None version: APIVersion = None diff --git a/readthedocs/projects/tasks/utils.py b/readthedocs/projects/tasks/utils.py index 8c9c6d6ce1a..45056023204 100644 --- a/readthedocs/projects/tasks/utils.py +++ b/readthedocs/projects/tasks/utils.py @@ -6,12 +6,9 @@ import structlog from 
celery.worker.request import Request from django.conf import settings -from django.contrib.auth.models import User -from django.db.models import Q, Sum +from django.db.models import Q from django.utils import timezone from django.utils.translation import gettext_lazy as _ -from djstripe.enums import SubscriptionStatus -from messages_extends.constants import WARNING_PERSISTENT from readthedocs.builds.constants import ( BUILD_FINAL_STATES, @@ -20,12 +17,7 @@ ) from readthedocs.builds.models import Build from readthedocs.builds.tasks import send_build_status -from readthedocs.core.permissions import AdminPermission from readthedocs.core.utils.filesystem import safe_rmtree -from readthedocs.notifications import Notification, SiteNotification -from readthedocs.notifications.backends import EmailBackend -from readthedocs.notifications.constants import REQUIREMENT -from readthedocs.projects.models import Project from readthedocs.storage import build_media_storage from readthedocs.worker import app @@ -170,316 +162,6 @@ def send_external_build_status(version_type, build_pk, commit, status): send_build_status.delay(build_pk, commit, status) -class DeprecatedConfigFileSiteNotification(SiteNotification): - # TODO: mention all the project slugs here - # Maybe trim them to up to 5 projects to avoid sending a huge blob of text - failure_message = _( - 'Your project(s) "{{ project_slugs }}" don\'t have a configuration file. ' - "Configuration files will soon be required by projects, " - "and will no longer be optional. " - 'Read our blog post to create one ' # noqa - "and ensure your project continues building successfully." 
- ) - failure_level = WARNING_PERSISTENT - - -class DeprecatedConfigFileEmailNotification(Notification): - app_templates = "projects" - name = "deprecated_config_file_used" - subject = "[Action required] Add a configuration file to your project to prevent build failures" - level = REQUIREMENT - - def send(self): - """Method overwritten to remove on-site backend.""" - backend = EmailBackend(self.request) - backend.send(self) - - -@app.task(queue="web") -def deprecated_config_file_used_notification(): - """ - Create a notification about not using a config file for all the maintainers of the project. - - This is a scheduled task to be executed on the webs. - Note the code uses `.iterator` and `.only` to avoid killing the db with this query. - Besdies, it excludes projects with enough spam score to be skipped. - """ - # Skip projects with a spam score bigger than this value. - # Currently, this gives us ~250k in total (from ~550k we have in our database) - spam_score = 300 - - projects = set() - start_datetime = datetime.datetime.now() - queryset = Project.objects.exclude(users__profile__banned=True) - if settings.ALLOW_PRIVATE_REPOS: - # Only send emails to active customers - queryset = queryset.filter( - organizations__stripe_subscription__status=SubscriptionStatus.active - ) - else: - # Take into account spam score on community - queryset = queryset.annotate(spam_score=Sum("spam_rules__value")).filter( - Q(spam_score__lt=spam_score) | Q(is_spam=False) - ) - queryset = queryset.only("slug", "default_version").order_by("id") - n_projects = queryset.count() - - for i, project in enumerate(queryset.iterator()): - if i % 500 == 0: - log.info( - "Finding projects without a configuration file.", - progress=f"{i}/{n_projects}", - current_project_pk=project.pk, - current_project_slug=project.slug, - projects_found=len(projects), - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - # Only check for the default version because if the project is using tags 
- # they won't be able to update those and we will send them emails forever. - # We can update this query if we consider later. - version = ( - project.versions.filter(slug=project.default_version).only("id").first() - ) - if version: - # Use a fixed date here to avoid changing the date on each run - years_ago = timezone.datetime(2022, 6, 1) - build = ( - version.builds.filter(success=True, date__gt=years_ago) - .only("_config") - .order_by("-date") - .first() - ) - if build and build.deprecated_config_used(): - projects.add(project.slug) - - # Store all the users we want to contact - users = set() - - n_projects = len(projects) - queryset = Project.objects.filter(slug__in=projects).order_by("id") - for i, project in enumerate(queryset.iterator()): - if i % 500 == 0: - log.info( - "Querying all the users we want to contact.", - progress=f"{i}/{n_projects}", - current_project_pk=project.pk, - current_project_slug=project.slug, - users_found=len(users), - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - users.update(AdminPermission.owners(project).values_list("username", flat=True)) - - # Only send 1 email per user, - # even if that user has multiple projects without a configuration file. - # The notification will mention all the projects. - queryset = User.objects.filter( - username__in=users, - profile__banned=False, - profile__optout_email_config_file_deprecation=False, - ).order_by("id") - - n_users = queryset.count() - for i, user in enumerate(queryset.iterator()): - if i % 500 == 0: - log.info( - "Sending deprecated config file notification to users.", - progress=f"{i}/{n_users}", - current_user_pk=user.pk, - current_user_username=user.username, - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - # All the projects for this user that don't have a configuration file - # Use set() intersection in Python that's pretty quick since we only need the slugs. 
- # Otherwise we have to pass 82k slugs to the DB query, which makes it pretty slow. - user_projects = AdminPermission.projects(user, admin=True).values_list( - "slug", flat=True - ) - user_projects_slugs = list(set(user_projects) & projects) - user_projects = Project.objects.filter(slug__in=user_projects_slugs) - - # Create slug string for onsite notification - user_project_slugs = ", ".join(user_projects_slugs[:5]) - if len(user_projects) > 5: - user_project_slugs += " and others..." - - n_site = DeprecatedConfigFileSiteNotification( - user=user, - context_object=user, - extra_context={"project_slugs": user_project_slugs}, - success=False, - ) - n_site.send() - - n_email = DeprecatedConfigFileEmailNotification( - user=user, - context_object=user, - extra_context={"projects": user_projects}, - ) - n_email.send() - - log.info( - "Finish sending deprecated config file notifications.", - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - -class DeprecatedBuildImageSiteNotification(SiteNotification): - failure_message = _( - 'Your project(s) "{{ project_slugs }}" are using the deprecated "build.image" ' - 'config on their ".readthedocs.yaml" file. ' - 'This config is deprecated in favor of "build.os" and will be removed on October 16, 2023. ' # noqa - 'Read our blog post to migrate to "build.os" ' # noqa - "and ensure your project continues building successfully." - ) - failure_level = WARNING_PERSISTENT - - -class DeprecatedBuildImageEmailNotification(Notification): - app_templates = "projects" - name = "deprecated_build_image_used" - subject = '[Action required] Update your ".readthedocs.yaml" file to use "build.os"' - level = REQUIREMENT - - def send(self): - """Method overwritten to remove on-site backend.""" - backend = EmailBackend(self.request) - backend.send(self) - - -@app.task(queue="web") -def deprecated_build_image_notification(): - """ - Send an email notification about using "build.image" to all maintainers of the project. 
- - This is a scheduled task to be executed on the webs. - Note the code uses `.iterator` and `.only` to avoid killing the db with this query. - Besdies, it excludes projects with enough spam score to be skipped. - """ - # Skip projects with a spam score bigger than this value. - # Currently, this gives us ~250k in total (from ~550k we have in our database) - spam_score = 300 - - projects = set() - start_datetime = datetime.datetime.now() - queryset = Project.objects.exclude(users__profile__banned=True) - if settings.ALLOW_PRIVATE_REPOS: - # Only send emails to active customers - queryset = queryset.filter( - organizations__stripe_subscription__status=SubscriptionStatus.active - ) - else: - # Take into account spam score on community - queryset = queryset.annotate(spam_score=Sum("spam_rules__value")).filter( - Q(spam_score__lt=spam_score) | Q(is_spam=False) - ) - queryset = queryset.only("slug", "default_version").order_by("id") - n_projects = queryset.count() - - for i, project in enumerate(queryset.iterator()): - if i % 500 == 0: - log.info( - 'Finding projects using "build.image" config key.', - progress=f"{i}/{n_projects}", - current_project_pk=project.pk, - current_project_slug=project.slug, - projects_found=len(projects), - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - # Only check for the default version because if the project is using tags - # they won't be able to update those and we will send them emails forever. - # We can update this query if we consider later. 
- version = ( - project.versions.filter(slug=project.default_version).only("id").first() - ) - if version: - # Use a fixed date here to avoid changing the date on each run - years_ago = timezone.datetime(2022, 8, 1) - build = ( - version.builds.filter(success=True, date__gt=years_ago) - .only("_config") - .order_by("-date") - .first() - ) - if build and build.deprecated_build_image_used(): - projects.add(project.slug) - - # Store all the users we want to contact - users = set() - - n_projects = len(projects) - queryset = Project.objects.filter(slug__in=projects).order_by("id") - for i, project in enumerate(queryset.iterator()): - if i % 500 == 0: - log.info( - 'Querying all the users we want to contact about "build.image" deprecation.', - progress=f"{i}/{n_projects}", - current_project_pk=project.pk, - current_project_slug=project.slug, - users_found=len(users), - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - users.update(AdminPermission.owners(project).values_list("username", flat=True)) - - # Only send 1 email per user, - # even if that user has multiple projects using "build.image". - # The notification will mention all the projects. - queryset = User.objects.filter( - username__in=users, - profile__banned=False, - profile__optout_email_build_image_deprecation=False, - ).order_by("id") - - n_users = queryset.count() - for i, user in enumerate(queryset.iterator()): - if i % 500 == 0: - log.info( - 'Sending deprecated "build.image" config key notification to users.', - progress=f"{i}/{n_users}", - current_user_pk=user.pk, - current_user_username=user.username, - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - # All the projects for this user that are using "build.image". - # Use set() intersection in Python, which is pretty quick since we only need the slugs. - # Otherwise we have to pass 82k slugs to the DB query, which makes it pretty slow. 
- user_projects = AdminPermission.projects(user, admin=True).values_list( - "slug", flat=True - ) - user_projects_slugs = list(set(user_projects) & projects) - user_projects = Project.objects.filter(slug__in=user_projects_slugs) - - # Create slug string for onsite notification - user_project_slugs = ", ".join(user_projects_slugs[:5]) - if len(user_projects) > 5: - user_project_slugs += " and others..." - - n_site = DeprecatedBuildImageSiteNotification( - user=user, - context_object=user, - extra_context={"project_slugs": user_project_slugs}, - success=False, - ) - n_site.send() - - n_email = DeprecatedBuildImageEmailNotification( - user=user, - context_object=user, - extra_context={"projects": user_projects}, - ) - n_email.send() - - log.info( - 'Finish sending deprecated "build.image" config key notifications.', - time_elapsed=(datetime.datetime.now() - start_datetime).seconds, - ) - - @app.task(queue="web") def set_builder_scale_in_protection(instance, protected_from_scale_in): """ diff --git a/readthedocs/projects/tests/test_build_tasks.py b/readthedocs/projects/tests/test_build_tasks.py index 9ebadd56271..de60cd4d5f3 100644 --- a/readthedocs/projects/tests/test_build_tasks.py +++ b/readthedocs/projects/tests/test_build_tasks.py @@ -1,5 +1,6 @@ import os import pathlib +import textwrap from unittest import mock import django_dynamic_fixture as fixture @@ -14,6 +15,7 @@ from readthedocs.builds.models import Build from readthedocs.config import ALL, ConfigError from readthedocs.config.config import BuildConfigV2 +from readthedocs.config.tests.test_config import get_build_config from readthedocs.doc_builder.exceptions import BuildAppError from readthedocs.projects.exceptions import RepositoryError from readthedocs.projects.models import EnvironmentVariable, Project, WebHookEvent @@ -71,16 +73,6 @@ def _trigger_update_docs_task(self): build_commit=self.build.commit, ) - def _config_file(self, config): - config = BuildConfigV2( - {}, - config, - 
source_file="readthedocs.yaml", - ) - config.validate() - return config - - class TestCustomConfigFile(BuildEnvironmentBase): # Relative path to where a custom config file is assumed to exist in repo @@ -95,15 +87,6 @@ def _get_project(self): readthedocs_yaml_path=self.config_file_name, ) - def _config_file(self, config): - config = BuildConfigV2( - {}, - config, - source_file=self.config_file_name, - ) - config.validate() - return config - @mock.patch("readthedocs.doc_builder.director.load_yaml_config") @mock.patch("readthedocs.doc_builder.director.BuildDirector.build_docs_class") def test_config_is_stored(self, build_docs_class, load_yaml_config): @@ -111,14 +94,16 @@ def test_config_is_stored(self, build_docs_class, load_yaml_config): # We add the PDF format to this config so we can check that the # config file is in use - config = self._config_file( + config = get_build_config( { "version": 2, "formats": ["pdf"], "sphinx": { "configuration": "docs/conf.py", }, - } + }, + source_file=self.config_file_name, + validate=True, ) load_yaml_config.return_value = config build_docs_class.return_value = True # success @@ -150,10 +135,18 @@ def test_config_file_is_loaded( self.mocker.add_file_in_repo_checkout( self.config_file_name, - "version: 2\n" - "formats: [pdf]\n" - "sphinx:\n" - " configuration: docs/conf.py", + textwrap.dedent( + """ + version: 2 + build: + os: "ubuntu-22.04" + tools: + python: "3" + formats: [pdf] + sphinx: + configuration: docs/conf.py + """ + ), ) self._trigger_update_docs_task() @@ -178,14 +171,15 @@ class TestBuildTask(BuildEnvironmentBase): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") @pytest.mark.skip def test_build_sphinx_formats(self, load_yaml_config, formats, builders): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "formats": formats, "sphinx": { "configuration": "docs/conf.py", }, - } + }, + validate=True, ) self._trigger_update_docs_task() 
@@ -236,11 +230,12 @@ def test_build_sphinx_formats(self, load_yaml_config, formats, builders): def test_build_formats_only_html_for_external_versions( self, build_docs_class, load_yaml_config ): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "formats": "all", - } + }, + validate=True, ) build_docs_class.return_value = True @@ -255,14 +250,15 @@ def test_build_formats_only_html_for_external_versions( @mock.patch("readthedocs.doc_builder.director.load_yaml_config") @mock.patch("readthedocs.doc_builder.director.BuildDirector.build_docs_class") def test_build_respects_formats_mkdocs(self, build_docs_class, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "mkdocs": { "configuration": "mkdocs.yml", }, "formats": ["epub", "pdf"], - } + }, + validate=True, ) self._trigger_update_docs_task() @@ -272,14 +268,15 @@ def test_build_respects_formats_mkdocs(self, build_docs_class, load_yaml_config) @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_build_updates_documentation_type(self, load_yaml_config): assert self.version.documentation_type == "sphinx" - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "mkdocs": { "configuration": "mkdocs.yml", }, "formats": ["epub", "pdf"], - } + }, + validate=True, ) # Create the artifact paths, so that `store_build_artifacts` @@ -341,7 +338,7 @@ def test_build_updates_documentation_type(self, load_yaml_config): @mock.patch("readthedocs.projects.tasks.builds.LocalBuildEnvironment") @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_get_env_vars(self, load_yaml_config, build_environment, config, external): - load_yaml_config.return_value = self._config_file(config) + load_yaml_config.return_value = get_build_config(config, validate=True) if external: 
self.version.type = EXTERNAL @@ -422,14 +419,14 @@ def test_successful_build( build_complete, index_build, ): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { - "version": 2, "formats": "all", "sphinx": { "configuration": "docs/conf.py", }, - } + }, + validate=True, ) assert not BuildData.objects.all().exists() @@ -508,12 +505,30 @@ def test_successful_build( "version": "2", "formats": ["htmlzip", "pdf", "epub"], "python": { - "version": "3", "install": [], }, "conda": None, "build": { - "image": "readthedocs/build:latest", + "os": "ubuntu-22.04", + "commands": [], + "jobs": { + "post_build": [], + "post_checkout": [], + "post_create_environment": [], + "post_install": [], + "post_system_dependencies": [], + "pre_build": [], + "pre_checkout": [], + "pre_create_environment": [], + "pre_install": [], + "pre_system_dependencies": [], + }, + "tools": { + "python": { + "full_version": "3.12.0", + "version": "3", + } + }, "apt_packages": [], }, "doctype": "sphinx", @@ -680,14 +695,15 @@ def test_build_commands_executed( self, load_yaml_config, ): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "formats": "all", "sphinx": { "configuration": "docs/conf.py", }, - } + }, + validate=True, ) # Create the artifact paths, so it's detected by the builder @@ -737,17 +753,29 @@ def test_build_commands_executed( ] ) + python_version = settings.RTD_DOCKER_BUILD_SETTINGS["tools"]["python"]["3"] self.mocker.mocks["environment.run"].assert_has_calls( [ mock.call( "cat", - "readthedocs.yaml", + "readthedocs.yml", cwd="/tmp/readthedocs-tests/git-repository", ), + mock.call("asdf", "install", "python", python_version), + mock.call("asdf", "global", "python", python_version), + mock.call("asdf", "reshim", "python", record=False), + mock.call( + "python", + "-mpip", + "install", + "-U", + "virtualenv", + "setuptools", + ), mock.call( - "python3.7", + "python", 
"-mvirtualenv", - mock.ANY, + "$READTHEDOCS_VIRTUALENV_PATH", bin_path=None, cwd=None, ), @@ -770,14 +798,8 @@ def test_build_commands_executed( "install", "--upgrade", "--no-cache-dir", - "pillow", - "mock==1.0.1", - "alabaster>=0.7,<0.8,!=0.7.5", - "commonmark==0.9.1", - "recommonmark==0.5.0", "sphinx", - "sphinx-rtd-theme", - "readthedocs-sphinx-ext<2.3", + "readthedocs-sphinx-ext", bin_path=mock.ANY, cwd=mock.ANY, ), @@ -954,10 +976,13 @@ def test_build_commands_executed( @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_install_apt_packages(self, load_yaml_config): config = BuildConfigV2( - {}, { "version": 2, "build": { + "os": "ubuntu-22.04", + "tools": { + "python": "3", + }, "apt_packages": [ "clangd", "cmatrix", @@ -996,7 +1021,6 @@ def test_install_apt_packages(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_build_tools(self, load_yaml_config): config = BuildConfigV2( - {}, { "version": 2, "build": { @@ -1049,7 +1073,6 @@ def test_build_tools(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_build_jobs(self, load_yaml_config): config = BuildConfigV2( - {}, { "version": 2, "build": { @@ -1084,7 +1107,6 @@ def test_build_jobs(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_build_tools_cached(self, load_yaml_config, build_tools_storage, tarfile): config = BuildConfigV2( - {}, { "version": 2, "build": { @@ -1155,7 +1177,6 @@ def test_build_tools_cached(self, load_yaml_config, build_tools_storage, tarfile @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_build_commands(self, load_yaml_config): config = BuildConfigV2( - {}, { "version": 2, "build": { @@ -1216,7 +1237,7 @@ def test_build_commands(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_requirements_from_config_file_installed(self, load_yaml_config): - 
load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "python": { @@ -1227,6 +1248,7 @@ def test_requirements_from_config_file_installed(self, load_yaml_config): ], }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1250,13 +1272,20 @@ def test_requirements_from_config_file_installed(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_conda_config_calls_conda_command(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, + "build": { + "os": "ubuntu-22.04", + "tools": { + "python": "miniconda3-4.7", + }, + }, "conda": { "environment": "environment.yaml", }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1264,8 +1293,15 @@ def test_conda_config_calls_conda_command(self, load_yaml_config): # TODO: check we are saving the `conda.environment` in the config file # via the API call + python_version = settings.RTD_DOCKER_BUILD_SETTINGS["tools"]["python"][ + "miniconda3-4.7" + ] self.mocker.mocks["environment.run"].assert_has_calls( [ + mock.call("cat", "readthedocs.yml", cwd=mock.ANY), + mock.call("asdf", "install", "python", python_version), + mock.call("asdf", "global", "python", python_version), + mock.call("asdf", "reshim", "python", record=False), mock.call( "conda", "env", @@ -1285,10 +1321,7 @@ def test_conda_config_calls_conda_command(self, load_yaml_config): "--quiet", "--name", self.version.slug, - "mock", - "pillow", "sphinx", - "sphinx_rtd_theme", cwd=mock.ANY, ), mock.call( @@ -1298,17 +1331,48 @@ def test_conda_config_calls_conda_command(self, load_yaml_config): "install", "-U", "--no-cache-dir", - "recommonmark", "readthedocs-sphinx-ext", cwd=mock.ANY, bin_path=mock.ANY, ), - ] + mock.call("test", "-x", "_build/html", cwd=mock.ANY, record=False), + mock.call("lsb_release", "--description", record=False, demux=True), + 
mock.call("python", "--version", record=False, demux=True), + mock.call( + "dpkg-query", + "--showformat", + "${package} ${version}\\n", + "--show", + record=False, + demux=True, + ), + mock.call( + "conda", + "list", + "--json", + "--name", + "latest", + record=False, + demux=True, + ), + mock.call( + "python", + "-m", + "pip", + "list", + "--pre", + "--local", + "--format", + "json", + record=False, + demux=True, + ), + ], ) @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_python_mamba_commands(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "build": { @@ -1321,12 +1385,14 @@ def test_python_mamba_commands(self, load_yaml_config): "environment": "environment.yaml", }, }, + validate=True, ) self._trigger_update_docs_task() self.mocker.mocks["environment.run"].assert_has_calls( [ + mock.call("cat", "readthedocs.yml", cwd=mock.ANY), mock.call("asdf", "install", "python", "mambaforge-4.10.3-10"), mock.call("asdf", "global", "python", "mambaforge-4.10.3-10"), mock.call("asdf", "reshim", "python", record=False), @@ -1349,10 +1415,18 @@ def test_python_mamba_commands(self, load_yaml_config): "--quiet", "--name", "latest", - "mock", - "pillow", "sphinx", - "sphinx_rtd_theme", + cwd=mock.ANY, + ), + mock.call( + mock.ANY, + "-m", + "pip", + "install", + "-U", + "--no-cache-dir", + "readthedocs-sphinx-ext", + bin_path=mock.ANY, cwd=mock.ANY, ), ] @@ -1360,14 +1434,14 @@ def test_python_mamba_commands(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_sphinx_normalized_language(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { - "version": 2, "sphinx": { "configuration": "docs/conf.py", "fail_on_warning": True, }, }, + validate=True, ) self.project.language = "es-mx" self.project.save() @@ -1400,7 +1474,7 @@ def 
test_sphinx_normalized_language(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_sphinx_fail_on_warning(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "sphinx": { @@ -1408,6 +1482,7 @@ def test_sphinx_fail_on_warning(self, load_yaml_config): "fail_on_warning": True, }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1438,7 +1513,7 @@ def test_sphinx_fail_on_warning(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_mkdocs_fail_on_warning(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "mkdocs": { @@ -1446,6 +1521,7 @@ def test_mkdocs_fail_on_warning(self, load_yaml_config): "fail_on_warning": True, }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1471,7 +1547,7 @@ def test_mkdocs_fail_on_warning(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_python_install_setuptools(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "python": { @@ -1483,6 +1559,7 @@ def test_python_install_setuptools(self, load_yaml_config): ], }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1502,7 +1579,7 @@ def test_python_install_setuptools(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_python_install_pip(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "python": { @@ -1514,6 +1591,7 @@ def test_python_install_pip(self, load_yaml_config): ], }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1542,7 +1620,7 @@ def test_python_install_pip_extras(self, 
load_yaml_config): # `backends/sphinx.py` not finding a file. # # TypeError('expected str, bytes or os.PathLike object, not NoneType') - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "python": { @@ -1555,6 +1633,7 @@ def test_python_install_pip_extras(self, load_yaml_config): ], }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1579,7 +1658,7 @@ def test_python_install_pip_extras(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_python_install_pip_several_options(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "python": { @@ -1599,6 +1678,7 @@ def test_python_install_pip_several_options(self, load_yaml_config): ], }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1650,13 +1730,14 @@ def test_python_install_pip_several_options(self, load_yaml_config): ) @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_submodules_include(self, load_yaml_config, value, expected): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "submodules": { "include": value, }, }, + validate=True, ) self._trigger_update_docs_task() @@ -1672,11 +1753,12 @@ def test_submodules_include(self, load_yaml_config, value, expected): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_submodules_exclude(self, load_yaml_config): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "submodules": {"exclude": ["one"], "recursive": True}, }, + validate=True, ) self._trigger_update_docs_task() @@ -1700,11 +1782,12 @@ def test_submodules_exclude(self, load_yaml_config): @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_submodules_exclude_all(self, load_yaml_config): - 
load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "submodules": {"exclude": ALL, "recursive": True}, }, + validate=True, ) self._trigger_update_docs_task() @@ -1728,7 +1811,7 @@ def test_submodules_exclude_all(self, load_yaml_config): ) @mock.patch("readthedocs.doc_builder.director.load_yaml_config") def test_sphinx_builder(self, load_yaml_config, value, command): - load_yaml_config.return_value = self._config_file( + load_yaml_config.return_value = get_build_config( { "version": 2, "sphinx": { @@ -1736,6 +1819,7 @@ def test_sphinx_builder(self, load_yaml_config, value, command): "configuration": "docs/conf.py", }, }, + validate=True, ) self._trigger_update_docs_task() diff --git a/readthedocs/rtd_tests/tests/test_config_integration.py b/readthedocs/rtd_tests/tests/test_config_integration.py deleted file mode 100644 index 3f5ee432fd6..00000000000 --- a/readthedocs/rtd_tests/tests/test_config_integration.py +++ /dev/null @@ -1,321 +0,0 @@ -import tempfile -from os import path -from unittest import mock - -import yaml -from django.test import TestCase, override_settings -from django_dynamic_fixture import get - -from readthedocs.builds.models import Version -from readthedocs.config import SETUPTOOLS, BuildConfigV1, InvalidConfig -from readthedocs.config.models import PythonInstallRequirements -from readthedocs.doc_builder.config import load_yaml_config -from readthedocs.doc_builder.constants import DOCKER_IMAGE_SETTINGS -from readthedocs.projects.models import Project - - -def create_load(config=None): - """ - Mock out the build load function. - - This will create a BuildConfigV1 object and validate it. 
- """ - if config is None: - config = {} - - def inner(path=None, env_config=None, readthedocs_yaml_path=None): - env_config_defaults = { - 'output_base': '', - 'name': '1', - } - if env_config is not None: - env_config_defaults.update(env_config) - yaml_config = BuildConfigV1( - env_config_defaults, - config, - source_file='readthedocs.yml', - ) - yaml_config.validate() - return yaml_config - - return inner - - -def create_config_file(config, file_name='readthedocs.yml', base_path=None): - """ - Creates a readthedocs configuration file with name - ``file_name`` in ``base_path``. If ``base_path`` is not given - a temporal directory is created. - """ - if not base_path: - base_path = tempfile.mkdtemp() - full_path = path.join(base_path, file_name) - yaml.safe_dump(config, open(full_path, 'w')) - return full_path - - -class LoadConfigTests(TestCase): - - def setUp(self): - self.project = get( - Project, - main_language_project=None, - install_project=False, - container_image=None, - ) - self.version = get(Version, project=self.project) - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_supported_versions_default_image_1_0(self, load_config): - load_config.side_effect = create_load() - self.project.container_image = 'readthedocs/build:1.0' - self.project.enable_epub_build = True - self.project.enable_pdf_build = True - self.project.save() - config = load_yaml_config(self.version) - - expected_env_config = { - 'build': {'image': 'readthedocs/build:1.0'}, - 'defaults': { - 'install_project': self.project.install_project, - 'formats': [ - 'htmlzip', - 'epub', - 'pdf' - ], - 'requirements_file': self.project.requirements_file, - 'python_version': '3', - 'sphinx_configuration': mock.ANY, - 'build_image': 'readthedocs/build:1.0', - 'doctype': self.project.documentation_type, - }, - } - - img_settings = DOCKER_IMAGE_SETTINGS.get(self.project.container_image, None) - if img_settings: - expected_env_config.update(img_settings) - - 
load_config.assert_called_once_with( - path=mock.ANY, - env_config=expected_env_config, - readthedocs_yaml_path=None, - ) - self.assertEqual(config.python.version, '3') - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_supported_versions_image_2_0(self, load_config): - load_config.side_effect = create_load() - self.project.container_image = 'readthedocs/build:2.0' - self.project.save() - config = load_yaml_config(self.version) - self.assertEqual( - config.get_valid_python_versions(), - ['2', '2.7', '3', '3.5'], - ) - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_supported_versions_image_latest(self, load_config): - load_config.side_effect = create_load() - self.project.container_image = 'readthedocs/build:latest' - self.project.save() - config = load_yaml_config(self.version) - self.assertEqual( - config.get_valid_python_versions(), - ["2", "2.7", "3", "3.5", "3.6", "3.7", "3.8"], - ) - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_default_version(self, load_config): - load_config.side_effect = create_load() - config = load_yaml_config(self.version) - self.assertEqual(config.python.version, '3') - self.assertEqual(config.python_interpreter, 'python3.7') - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_set_python_version_on_project(self, load_config): - load_config.side_effect = create_load() - self.project.container_image = 'readthedocs/build:2.0' - self.project.python_interpreter = 'python3' - self.project.save() - config = load_yaml_config(self.version) - self.assertEqual(config.python.version, '3') - self.assertEqual(config.python_interpreter, 'python3.5') - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_set_python_version_in_config(self, load_config): - load_config.side_effect = create_load({ - 'python': {'version': 3.5}, - }) - self.project.container_image = 'readthedocs/build:2.0' - self.project.save() - 
config = load_yaml_config(self.version) - self.assertEqual(config.python.version, '3.5') - self.assertEqual(config.python_interpreter, 'python3.5') - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_set_python_310_version_in_config(self, load_config): - load_config.side_effect = create_load({ - 'build': {'image': 'testing'}, - 'python': {'version': '3.10'}, - }) - config = load_yaml_config(self.version) - self.assertEqual(config.python.version, '3.10') - self.assertEqual(config.python_interpreter, 'python3.10') - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_python_invalid_version_in_config(self, load_config): - load_config.side_effect = create_load({ - 'python': {'version': 2.6}, - }) - self.project.container_image = 'readthedocs/build:2.0' - self.project.save() - with self.assertRaises(InvalidConfig): - load_yaml_config(self.version) - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_install_project(self, load_config): - load_config.side_effect = create_load() - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 1) - self.assertTrue( - isinstance(config.python.install[0], PythonInstallRequirements) - ) - - load_config.side_effect = create_load({ - 'python': {'setup_py_install': True}, - }) - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 2) - self.assertTrue( - isinstance(config.python.install[0], PythonInstallRequirements) - ) - self.assertEqual( - config.python.install[1].method, - SETUPTOOLS - ) - - @mock.patch('readthedocs.doc_builder.config.load_config') - def test_extra_requirements(self, load_config): - load_config.side_effect = create_load({ - 'python': { - 'pip_install': True, - 'extra_requirements': ['tests', 'docs'], - }, - }) - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 2) - self.assertTrue( - isinstance(config.python.install[0], PythonInstallRequirements) - 
) - self.assertEqual( - config.python.install[1].extra_requirements, - ['tests', 'docs'] - ) - - load_config.side_effect = create_load({ - 'python': { - 'extra_requirements': ['tests', 'docs'], - }, - }) - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 1) - self.assertTrue( - isinstance(config.python.install[0], PythonInstallRequirements) - ) - - load_config.side_effect = create_load() - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 1) - self.assertTrue( - isinstance(config.python.install[0], PythonInstallRequirements) - ) - - load_config.side_effect = create_load({ - 'python': { - 'setup_py_install': True, - 'extra_requirements': ['tests', 'docs'], - }, - }) - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 2) - self.assertTrue( - isinstance(config.python.install[0], PythonInstallRequirements) - ) - self.assertEqual( - config.python.install[1].extra_requirements, - [] - ) - - @mock.patch('readthedocs.projects.models.Project.checkout_path') - def test_conda_with_config(self, checkout_path): - base_path = tempfile.mkdtemp() - checkout_path.return_value = base_path - conda_file = 'environment.yml' - full_conda_file = path.join(base_path, conda_file) - with open(full_conda_file, 'w') as f: - f.write('conda') - create_config_file( - { - 'conda': { - 'file': conda_file, - }, - }, - base_path=base_path, - ) - with override_settings(DOCROOT=base_path): - config = load_yaml_config(self.version) - self.assertTrue(config.conda is not None) - self.assertEqual(config.conda.environment, conda_file) - - @mock.patch('readthedocs.projects.models.Project.checkout_path') - def test_conda_without_config(self, checkout_path): - base_path = tempfile.mkdtemp() - checkout_path.return_value = base_path - config = load_yaml_config(self.version) - self.assertIsNone(config.conda) - - @mock.patch('readthedocs.projects.models.Project.checkout_path') - def 
test_requirements_file_from_project_setting(self, checkout_path): - base_path = tempfile.mkdtemp() - checkout_path.return_value = base_path - - requirements_file = 'requirements.txt' - self.project.requirements_file = requirements_file - self.project.save() - - full_requirements_file = path.join(base_path, requirements_file) - with open(full_requirements_file, 'w') as f: - f.write('pip') - - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 1) - self.assertEqual( - config.python.install[0].requirements, - requirements_file - ) - - @mock.patch('readthedocs.projects.models.Project.checkout_path') - def test_requirements_file_from_yml(self, checkout_path): - base_path = tempfile.mkdtemp() - checkout_path.return_value = base_path - - self.project.requirements_file = 'no-existent-file.txt' - self.project.save() - - requirements_file = 'requirements.txt' - full_requirements_file = path.join(base_path, requirements_file) - with open(full_requirements_file, 'w') as f: - f.write('pip') - create_config_file( - { - 'requirements_file': requirements_file, - }, - base_path=base_path, - ) - with override_settings(DOCROOT=base_path): - config = load_yaml_config(self.version) - self.assertEqual(len(config.python.install), 1) - self.assertEqual( - config.python.install[0].requirements, - requirements_file - ) diff --git a/readthedocs/rtd_tests/tests/test_doc_builder.py b/readthedocs/rtd_tests/tests/test_doc_builder.py index ede87072177..23fae392bc9 100644 --- a/readthedocs/rtd_tests/tests/test_doc_builder.py +++ b/readthedocs/rtd_tests/tests/test_doc_builder.py @@ -11,24 +11,18 @@ from django_dynamic_fixture import get from readthedocs.builds.models import Version +from readthedocs.config.tests.test_config import get_build_config from readthedocs.doc_builder.backends.mkdocs import ( MkdocsHTML, SafeDumper, yaml_load_safely, ) -from readthedocs.doc_builder.backends.sphinx import ( - BaseSphinx, - HtmlBuilder, - HtmlDirBuilder, - 
SingleHtmlBuilder, -) -from readthedocs.doc_builder.config import load_yaml_config +from readthedocs.doc_builder.backends.sphinx import BaseSphinx from readthedocs.doc_builder.environments import LocalBuildEnvironment from readthedocs.doc_builder.exceptions import MkDocsYAMLParseError from readthedocs.doc_builder.python_environments import Virtualenv from readthedocs.projects.exceptions import ProjectConfigurationError from readthedocs.projects.models import Feature, Project -from readthedocs.rtd_tests.tests.test_config_integration import create_load @override_settings(PRODUCTION_DOMAIN="readthedocs.org") @@ -51,9 +45,10 @@ def setUp(self): BaseSphinx.sphinx_build_dir = tempfile.mkdtemp() BaseSphinx.relative_output_dir = "_readthedocs/" - @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir') - @patch('readthedocs.projects.models.Project.checkout_path') - def test_conf_py_path(self, checkout_path, docs_dir): + @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir") + @patch("readthedocs.projects.models.Project.checkout_path") + @patch("readthedocs.doc_builder.python_environments.load_yaml_config") + def test_conf_py_path(self, load_yaml_config, checkout_path, docs_dir): """ Test the conf_py_path that is added to the conf.py file. 
@@ -66,7 +61,7 @@ def test_conf_py_path(self, checkout_path, docs_dir):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config({}, validate=True),
         )
         base_sphinx = BaseSphinx(
             build_env=self.build_env,
@@ -83,13 +78,15 @@ def test_conf_py_path(self, checkout_path, docs_dir):
             expected,
         )
 
-    @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir')
-    @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.get_config_params')
-    @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.run')
-    @patch('readthedocs.builds.models.Version.get_conf_py_path')
-    @patch('readthedocs.projects.models.Project.checkout_path')
+    @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir")
+    @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.get_config_params")
+    @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.run")
+    @patch("readthedocs.builds.models.Version.get_conf_py_path")
+    @patch("readthedocs.projects.models.Project.checkout_path")
+    @patch("readthedocs.doc_builder.python_environments.load_yaml_config")
     def test_project_without_conf_py(
         self,
+        load_yaml_config,
         checkout_path,
         get_conf_py_path,
         _,
@@ -110,7 +107,7 @@ def test_project_without_conf_py(
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config({}, validate=True),
         )
         base_sphinx = BaseSphinx(
             build_env=self.build_env,
@@ -121,13 +118,20 @@ def test_project_without_conf_py(
         ):
             base_sphinx.append_conf()
 
-    @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir')
-    @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.get_config_params')
-    @patch('readthedocs.doc_builder.backends.sphinx.BaseSphinx.run')
-    @patch('readthedocs.builds.models.Version.get_conf_py_path')
-    @patch('readthedocs.projects.models.Project.checkout_path')
+    @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.docs_dir")
+    @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.get_config_params")
+    @patch("readthedocs.doc_builder.backends.sphinx.BaseSphinx.run")
+    @patch("readthedocs.builds.models.Version.get_conf_py_path")
+    @patch("readthedocs.projects.models.Project.checkout_path")
+    @patch("readthedocs.doc_builder.python_environments.load_yaml_config")
     def test_multiple_conf_py(
-        self, checkout_path, get_conf_py_path, _, get_config_params, docs_dir
+        self,
+        load_yaml_config,
+        checkout_path,
+        get_conf_py_path,
+        _,
+        get_config_params,
+        docs_dir,
     ):
         """
         Test for a project with multiple ``conf.py`` files.
@@ -146,7 +150,7 @@ def test_multiple_conf_py(
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config({}, validate=True),
         )
         base_sphinx = BaseSphinx(
             build_env=self.build_env,
@@ -156,35 +160,6 @@ def test_multiple_conf_py(
         with override_settings(DOCROOT=tmp_docs_dir):
             base_sphinx.append_conf()
 
-    @mock.patch("readthedocs.doc_builder.config.load_config")
-    def test_use_sphinx_builders(self, load_config):
-        config_data = {"version": 2, "sphinx": {"configuration": "docs/conf.py"}}
-        load_config.side_effect = create_load(config_data)
-        config = load_yaml_config(self.version)
-
-        python_env = Virtualenv(
-            version=self.version,
-            build_env=self.build_env,
-            config=config,
-        )
-        builder = HtmlBuilder(
-            build_env=self.build_env,
-            python_env=python_env,
-        )
-        self.assertEqual(builder.sphinx_builder, "html")
-
-        builder = HtmlDirBuilder(
-            build_env=self.build_env,
-            python_env=python_env,
-        )
-        self.assertEqual(builder.sphinx_builder, "dirhtml")
-
-        builder = SingleHtmlBuilder(
-            build_env=self.build_env,
-            python_env=python_env,
-        )
-        self.assertEqual(builder.sphinx_builder, "singlehtml")
-
 
 @override_settings(PRODUCTION_DOMAIN='readthedocs.org')
 class MkdocsBuilderTest(TestCase):
@@ -204,7 +179,9 @@ def test_get_theme_name(self, checkout_path):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         builder = MkdocsHTML(
             build_env=self.build_env,
@@ -253,7 +230,9 @@ def test_get_theme_name_with_feature_flag(self, checkout_path, run):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         builder = MkdocsHTML(
             build_env=self.build_env,
@@ -328,7 +307,9 @@ def test_append_conf_existing_yaml_on_root(self, checkout_path, run):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -378,7 +359,9 @@ def test_append_conf_existing_yaml_on_root_with_invalid_setting(self, checkout_p
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -411,7 +394,9 @@ def test_append_conf_and_none_values(self, checkout_path, run):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         builder = MkdocsHTML(
             build_env=self.build_env,
@@ -465,7 +450,9 @@ def test_dont_override_theme(self, checkout_path, run):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -502,7 +489,9 @@ def test_write_js_data_docs_dir(self, checkout_path, run, generate_rtd_data):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -539,7 +528,9 @@ def test_write_js_data_on_invalid_docs_dir(self, checkout_path, generate_rtd_dat
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -570,7 +561,9 @@ def test_append_conf_existing_yaml_with_extra(self, checkout_path, run):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -613,7 +606,9 @@ def test_empty_yaml_config(self, checkout_path):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
@@ -640,7 +635,9 @@ def test_yaml_config_not_returns_dict(self, checkout_path):
         python_env = Virtualenv(
             version=self.version,
             build_env=self.build_env,
-            config=None,
+            config=get_build_config(
+                {"mkdocs": {"configuration": "mkdocs.yml"}}, validate=True
+            ),
         )
         self.searchbuilder = MkdocsHTML(
             build_env=self.build_env,
diff --git a/readthedocs/rtd_tests/tests/test_doc_building.py b/readthedocs/rtd_tests/tests/test_doc_building.py
index 92c21b3b81f..3d63ae1b7a4 100644
--- a/readthedocs/rtd_tests/tests/test_doc_building.py
+++ b/readthedocs/rtd_tests/tests/test_doc_building.py
@@ -1,7 +1,5 @@
 import os
-import tempfile
 import uuid
-from itertools import zip_longest
 from unittest import mock
 from unittest.mock import Mock, PropertyMock, patch
@@ -18,9 +16,7 @@
     LocalBuildEnvironment,
 )
 from readthedocs.doc_builder.exceptions import BuildAppError
-from readthedocs.doc_builder.python_environments import Conda, Virtualenv
 from readthedocs.projects.models import Project
-from readthedocs.rtd_tests.mocks.paths import fake_paths_lookup
 
 DUMMY_BUILD_ID = 123
 SAMPLE_UNICODE = 'HérÉ îß sömê ünïçó∂é'
@@ -379,276 +375,3 @@ def test_command_oom_kill(self):
             'Command killed due to timeout or excessive memory consumption\n',
             str(cmd.output),
         )
-
-
-class TestPythonEnvironment(TestCase):
-
-    def setUp(self):
-        self.project_sphinx = get(Project, documentation_type='sphinx')
-        self.version_sphinx = get(Version, project=self.project_sphinx)
-
-        self.project_mkdocs = get(Project, documentation_type='mkdocs')
-        self.version_mkdocs = get(Version, project=self.project_mkdocs)
-
-        self.build_env_mock = Mock()
-
-        self.base_requirements = [
-            "pillow",
-            "mock",
-            "alabaster",
-        ]
-        self.base_conda_requirements = [
-            'mock',
-            'pillow',
-        ]
-
-        self.pip_install_args = [
-            mock.ANY,  # python path
-            '-m',
-            'pip',
-            'install',
-            '--upgrade',
-            '--no-cache-dir',
-        ]
-
-    def assertArgsStartsWith(self, args, call):
-        """
-        Assert that each element of args of the mock start
-        with each element of args.
- """ - args_mock, _ = call - for arg, arg_mock in zip_longest(args, args_mock): - if arg is not mock.ANY: - self.assertIsNotNone(arg_mock) - self.assertTrue(arg_mock.startswith(arg), arg) - - @patch('readthedocs.projects.models.Project.checkout_path') - def test_install_core_requirements_sphinx(self, checkout_path): - tmpdir = tempfile.mkdtemp() - checkout_path.return_value = tmpdir - python_env = Virtualenv( - version=self.version_sphinx, - build_env=self.build_env_mock, - ) - python_env.install_core_requirements() - requirements_sphinx = [ - "commonmark", - "recommonmark", - "sphinx", - "sphinx-rtd-theme", - "readthedocs-sphinx-ext", - ] - - self.assertEqual(self.build_env_mock.run.call_count, 2) - calls = self.build_env_mock.run.call_args_list - - core_args = self.pip_install_args + ["pip", "setuptools"] - self.assertArgsStartsWith(core_args, calls[0]) - - requirements = self.base_requirements + requirements_sphinx - args = self.pip_install_args + requirements - self.assertArgsStartsWith(args, calls[1]) - - @patch('readthedocs.projects.models.Project.checkout_path') - def test_install_core_requirements_mkdocs(self, checkout_path): - tmpdir = tempfile.mkdtemp() - checkout_path.return_value = tmpdir - python_env = Virtualenv( - version=self.version_mkdocs, - build_env=self.build_env_mock, - ) - python_env.install_core_requirements() - requirements_mkdocs = [ - 'commonmark', - 'recommonmark', - 'mkdocs', - ] - - self.assertEqual(self.build_env_mock.run.call_count, 2) - calls = self.build_env_mock.run.call_args_list - - core_args = self.pip_install_args + ["pip", "setuptools"] - self.assertArgsStartsWith(core_args, calls[0]) - - requirements = self.base_requirements + requirements_mkdocs - args = self.pip_install_args + requirements - self.assertArgsStartsWith(args, calls[1]) - - @patch('readthedocs.projects.models.Project.checkout_path') - def test_install_user_requirements(self, checkout_path): - """ - If a projects does not specify a requirements file, - RTD 
will choose one automatically. - - First by searching under the docs/ directory and then under the root. - The files can be named as: - - - ``pip_requirements.txt`` - - ``requirements.txt`` - """ - tmpdir = tempfile.mkdtemp() - checkout_path.return_value = tmpdir - self.build_env_mock.project = self.project_sphinx - self.build_env_mock.version = self.version_sphinx - python_env = Virtualenv( - version=self.version_sphinx, - build_env=self.build_env_mock, - ) - - checkout_path = python_env.checkout_path - docs_requirements = os.path.join( - checkout_path, 'docs', 'requirements.txt', - ) - root_requirements = os.path.join( - checkout_path, 'requirements.txt', - ) - paths = { - os.path.join(checkout_path, 'docs'): True, - } - args = [ - mock.ANY, # python path - '-m', - 'pip', - 'install', - '--exists-action=w', - '--no-cache-dir', - '-r', - 'requirements_file', - ] - - # One requirements file on the docs/ dir - # should be installed - paths[docs_requirements] = True - paths[root_requirements] = False - with fake_paths_lookup(paths): - python_env.install_requirements() - args[-1] = 'docs/requirements.txt' - self.build_env_mock.run.assert_called_with( - *args, cwd=mock.ANY, bin_path=mock.ANY - ) - - # One requirements file on the root dir - # should be installed - paths[docs_requirements] = False - paths[root_requirements] = True - with fake_paths_lookup(paths): - python_env.install_requirements() - args[-1] = 'requirements.txt' - self.build_env_mock.run.assert_called_with( - *args, cwd=mock.ANY, bin_path=mock.ANY - ) - - # Two requirements files on the root and docs/ dirs - # the one on docs/ should be installed - paths[docs_requirements] = True - paths[root_requirements] = True - with fake_paths_lookup(paths): - python_env.install_requirements() - args[-1] = 'docs/requirements.txt' - self.build_env_mock.run.assert_called_with( - *args, cwd=mock.ANY, bin_path=mock.ANY - ) - - # No requirements file - # no requirements should be installed - 
self.build_env_mock.run.reset_mock() - paths[docs_requirements] = False - paths[root_requirements] = False - with fake_paths_lookup(paths): - python_env.install_requirements() - self.build_env_mock.run.assert_not_called() - - @patch('readthedocs.projects.models.Project.checkout_path') - def test_install_core_requirements_sphinx_conda(self, checkout_path): - tmpdir = tempfile.mkdtemp() - checkout_path.return_value = tmpdir - python_env = Conda( - version=self.version_sphinx, - build_env=self.build_env_mock, - ) - python_env.install_core_requirements() - conda_sphinx = [ - 'sphinx', - 'sphinx_rtd_theme', - ] - conda_requirements = self.base_conda_requirements + conda_sphinx - pip_requirements = [ - 'recommonmark', - 'readthedocs-sphinx-ext', - ] - - args_pip = [ - mock.ANY, # python path - '-m', - 'pip', - 'install', - '-U', - '--no-cache-dir', - ] - args_pip.extend(pip_requirements) - - args_conda = [ - 'conda', - 'install', - '--yes', - '--quiet', - '--name', - self.version_sphinx.slug, - ] - args_conda.extend(conda_requirements) - - self.build_env_mock.run.assert_has_calls([ - mock.call(*args_conda, cwd=mock.ANY), - mock.call(*args_pip, bin_path=mock.ANY, cwd=mock.ANY), - ]) - - @patch('readthedocs.projects.models.Project.checkout_path') - def test_install_core_requirements_mkdocs_conda(self, checkout_path): - tmpdir = tempfile.mkdtemp() - checkout_path.return_value = tmpdir - python_env = Conda( - version=self.version_mkdocs, - build_env=self.build_env_mock, - ) - python_env.install_core_requirements() - conda_requirements = self.base_conda_requirements - pip_requirements = [ - 'recommonmark', - 'mkdocs', - ] - - args_pip = [ - mock.ANY, # python path - '-m', - 'pip', - 'install', - '-U', - '--no-cache-dir', - ] - args_pip.extend(pip_requirements) - - args_conda = [ - 'conda', - 'install', - '--yes', - '--quiet', - '--name', - self.version_mkdocs.slug, - ] - args_conda.extend(conda_requirements) - - self.build_env_mock.run.assert_has_calls([ - 
mock.call(*args_conda, cwd=mock.ANY), - mock.call(*args_pip, bin_path=mock.ANY, cwd=mock.ANY), - ]) - - @patch('readthedocs.projects.models.Project.checkout_path') - def test_install_user_requirements_conda(self, checkout_path): - tmpdir = tempfile.mkdtemp() - checkout_path.return_value = tmpdir - python_env = Conda( - version=self.version_sphinx, - build_env=self.build_env_mock, - ) - python_env.install_requirements() - self.build_env_mock.run.assert_not_called() diff --git a/readthedocs/settings/base.py b/readthedocs/settings/base.py index c8dfa7f74c6..90bc763f4aa 100644 --- a/readthedocs/settings/base.py +++ b/readthedocs/settings/base.py @@ -553,20 +553,6 @@ def TEMPLATES(self): 'schedule': crontab(minute=0, hour=4), 'options': {'queue': 'web'}, }, - # We keep having celery send multiple emails, - # which is a terrible UX, - # so let's remove them for now. - - # 'weekly-config-file-notification': { - # 'task': 'readthedocs.projects.tasks.utils.deprecated_config_file_used_notification', - # 'schedule': crontab(day_of_week='wed', hour=11, minute=15), - # 'options': {'queue': 'web'}, - # }, - # 'weekly-build-image-notification': { - # 'task': 'readthedocs.projects.tasks.utils.deprecated_build_image_notification', - # 'schedule': crontab(day_of_week='wed', hour=9, minute=15), - # 'options': {'queue': 'web'}, - # }, } # Sentry @@ -589,64 +575,9 @@ def TEMPLATES(self): RTD_DOCKER_COMPOSE = False DOCKER_VERSION = 'auto' - DOCKER_DEFAULT_VERSION = 'latest' + DOCKER_DEFAULT_VERSION = 'ubuntu-22.04' DOCKER_IMAGE = '{}:{}'.format(constants_docker.DOCKER_DEFAULT_IMAGE, DOCKER_DEFAULT_VERSION) - DOCKER_IMAGE_SETTINGS = { - # A large number of users still have this pinned in their config file. - # We must have documented it at some point. 
-        'readthedocs/build:2.0': {
-            'python': {
-                'supported_versions': ['2', '2.7', '3', '3.5'],
-                'default_version': {
-                    '2': '2.7',
-                    '3': '3.5',
-                },
-            },
-        },
-        'readthedocs/build:4.0': {
-            'python': {
-                'supported_versions': ['2', '2.7', '3', '3.5', '3.6', 3.7],
-                'default_version': {
-                    '2': '2.7',
-                    '3': '3.7',
-                },
-            },
-        },
-        'readthedocs/build:5.0': {
-            'python': {
-                'supported_versions': ['2', '2.7', '3', '3.5', '3.6', '3.7'],
-                'default_version': {
-                    '2': '2.7',
-                    '3': '3.7',
-                },
-            },
-        },
-        'readthedocs/build:6.0': {
-            'python': {
-                'supported_versions': ['2', '2.7', '3', '3.5', '3.6', '3.7', '3.8'],
-                'default_version': {
-                    '2': '2.7',
-                    '3': '3.7',
-                },
-            },
-        },
-        'readthedocs/build:7.0': {
-            'python': {
-                'supported_versions': ['2', '2.7', '3', '3.5', '3.6', '3.7', '3.8', '3.9', '3.10'],
-                'default_version': {
-                    '2': '2.7',
-                    '3': '3.7',
-                },
-            },
-        },
-    }
-    # Alias tagged via ``docker tag`` on the build servers
-    DOCKER_IMAGE_SETTINGS.update({
-        'readthedocs/build:stable': DOCKER_IMAGE_SETTINGS.get('readthedocs/build:5.0'),
-        'readthedocs/build:latest': DOCKER_IMAGE_SETTINGS.get('readthedocs/build:6.0'),
-        'readthedocs/build:testing': DOCKER_IMAGE_SETTINGS.get('readthedocs/build:7.0'),
-    })
 
     # Additional binds for the build container
     RTD_DOCKER_ADDITIONAL_BINDS = {}
     RTD_DOCKER_BUILD_SETTINGS = constants_docker.RTD_DOCKER_BUILD_SETTINGS
diff --git a/readthedocs/telemetry/tests/test_collectors.py b/readthedocs/telemetry/tests/test_collectors.py
index b8f9964d1a1..755f7f02766 100644
--- a/readthedocs/telemetry/tests/test_collectors.py
+++ b/readthedocs/telemetry/tests/test_collectors.py
@@ -5,7 +5,7 @@
 from django.test import TestCase
 from django_dynamic_fixture import get
 
-from readthedocs.config import BuildConfigV2
+from readthedocs.config.tests.test_config import get_build_config
 from readthedocs.doc_builder.environments import DockerBuildEnvironment
 from readthedocs.projects.models import Project
 from readthedocs.telemetry.collectors import BuildDataCollector
@@ -17,24 +17,19 @@ def setUp(self):
         self.user = get(User)
         self.project = get(Project, slug="test", users=[self.user])
         self.version = self.project.versions.first()
+
+        config = get_build_config({})
+        config.validate()
+
         self.environment = DockerBuildEnvironment(
             version=self.version,
             project=self.project,
             build={"id": 1},
-            config=self._get_build_config({}),
+            config=config,
             api_client=mock.MagicMock(),
         )
         self.collector = BuildDataCollector(self.environment)
 
-    def _get_build_config(self, config, env_config=None):
-        config = BuildConfigV2(
-            env_config=env_config or {},
-            raw_config=config,
-            source_file="readthedocs.yaml",
-        )
-        config.validate()
-        return config
-
     def test_get_operating_system(self, run):
         run.return_value = (0, "Description:\tUbuntu 20.04.3 LTS", "")
         out = self.collector._get_operating_system()
@@ -90,9 +85,10 @@ def test_get_all_conda_packages(self, run):
         )
 
     def test_get_user_pip_packages(self, run):
-        self.collector.config = self._get_build_config(
+        self.collector.config = get_build_config(
             {"python": {"install": [{"requirements": "docs/requirements.txt"}]}}
         )
+        self.collector.config.validate()
         out = dedent(
             """
             requests-mock==1.8.0
@@ -186,9 +182,18 @@ def test_get_all_apt_packages(self, run):
         )
 
     def test_get_user_apt_packages(self, run):
-        self.collector.config = self._get_build_config(
-            {"build": {"apt_packages": ["cmake", "libclang"]}}
+        self.collector.config = get_build_config(
+            {
+                "build": {
+                    "os": "ubuntu-22.04",
+                    "tools": {
+                        "python": "3",
+                    },
+                    "apt_packages": ["cmake", "libclang"],
+                }
+            }
         )
+        self.collector.config.validate()
         self.assertEqual(
             self.collector._get_user_apt_packages(),
             [
diff --git a/readthedocs/templates/projects/project_advanced_settings_helptext.html b/readthedocs/templates/projects/project_advanced_settings_helptext.html
index e254f936c4c..097a66b9395 100644
--- a/readthedocs/templates/projects/project_advanced_settings_helptext.html
+++ b/readthedocs/templates/projects/project_advanced_settings_helptext.html
@@ -1,9 +1,9 @@
 {% load i18n %}
 
   {% blocktrans trimmed with deprecation_link="https://blog.readthedocs.com/migrate-configuration-v2/" %}
-  Usage of the below settings is deprecated and support for these fields will be removed on September 25th, 2023.
+  Usage of the below settings is deprecated and support for these fields was removed on September 25th, 2023.
+  Their values are shown here in read-only to allow you to migrate to the YAML config file if you haven't already.
 
-  For more information, see our blog post:
-  migrating your configuration to .readthedocs.yaml.
+  Read our blog post Migrate your project to .readthedocs.yaml configuration file v2 to learn how to perform the migration.
 {% endblocktrans %}
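The deleted `TestPythonEnvironment` class earlier in this diff relied on an `assertArgsStartsWith` helper that compared an expected argument list against a recorded mock call by string prefix, treating `mock.ANY` as a wildcard. A standalone, hedged adaptation of that idea follows; the name `args_start_with` is illustrative (not part of the codebase), and it returns a boolean and tolerates extra trailing call arguments instead of raising like the removed `unittest` helper did:

```python
from itertools import zip_longest
from unittest import mock


def args_start_with(expected, call):
    """Return True if each expected argument is a prefix of the corresponding
    positional argument recorded in a mock call.

    mock.ANY (or a missing expected entry) matches anything, mirroring how the
    removed assertArgsStartsWith helper skipped wildcard positions.
    """
    args_mock, _kwargs = call  # a mock call unpacks into (args, kwargs)
    for arg, arg_mock in zip_longest(expected, args_mock):
        if arg is None or arg is mock.ANY:
            continue  # unchecked position
        if arg_mock is None or not arg_mock.startswith(arg):
            return False
    return True


# Example: check a recorded pip invocation by prefix, ignoring the
# interpreter path (which varies per virtualenv).
run = mock.Mock()
run("/venv/bin/python", "-m", "pip", "install", "--upgrade", "pip")
assert args_start_with([mock.ANY, "-m", "pip", "install"], run.call_args)
assert not args_start_with([mock.ANY, "-m", "conda"], run.call_args)
```

Prefix matching (rather than equality) is what lets such tests accept arguments like `--exists-action=w` or versioned requirement specifiers without pinning the exact string.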