Test failures #2

Open

sevro opened this issue Jun 25, 2021 · 1 comment
Labels
bug Something isn't working

sevro commented Jun 25, 2021

A number of dataset, dataloader, and wrapper tests fail.

Complete steps to reproduce the bug

poetry run pytest
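
The dataset helper tests appear to run only when TORCHMETAL_DATA_FOLDER points at a local dataset folder (see the skipif markers and os.getenv calls in the test code quoted in the tracebacks below), so a full run looks roughly like the following, with placeholder values:

export TORCHMETAL_DATA_FOLDER=/path/to/datasets
export TORCHMETAL_DOWNLOAD=1   # optional; any non-empty value enables downloads
poetry run pytest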

Expected behavior

All tests pass.

Environment

  • OS: Ubuntu 20.04
  • Library Version: torchmeta 1.7 fork (before any changes) and the dev branch
  • Using the poetry environment
  • The lock file has not been changed on dev

Additional context

Test output:

============================= test session starts ==============================
platform linux -- Python 3.8.3, pytest-5.4.3, py-1.10.0, pluggy-0.13.1
rootdir: /home/datenstrom/workspace/src/github.com/sevro/torchmetal
collected 275 items

tests/test_torchmetal.py .                                               [  0%]
tests/datasets/test_datasets_helpers.py ..F.......F.......F.......F..... [ 12%]
..F.......F.....                                                         [ 17%]
tests/modules/test_activation.py ........................                [ 26%]
tests/modules/test_container.py ..                                       [ 27%]
tests/modules/test_conv.py ............                                  [ 31%]
tests/modules/test_linear.py ........                                    [ 34%]
tests/modules/test_module.py ..                                          [ 35%]
tests/modules/test_parallel.py ..........                                [ 38%]
tests/modules/test_sparse.py ........                                    [ 41%]
tests/toy/test_toy.py ...........                                        [ 45%]
tests/transforms/test_splitters.py ..                                    [ 46%]
tests/utils/test_dataloaders.py ......F.......F.......F.......F.......F. [ 61%]
......F......                                                            [ 65%]
tests/utils/test_gradient_based.py ...                                   [ 66%]
tests/utils/test_matching.py ....                                        [ 68%]
tests/utils/test_prototype.py ...                                        [ 69%]
tests/utils/test_r2d2.py ............................................... [ 86%]
.............                                                            [ 91%]
tests/utils/test_wrappers.py ..F.......F.......F.....                    [100%]

=================================== FAILURES ===================================
________________ test_datasets_helpers[train-1-tieredimagenet] _________________

name = 'tieredimagenet', shots = 1, split = 'train'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/datasets/test_datasets_helpers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
________________ test_datasets_helpers[train-5-tieredimagenet] _________________

name = 'tieredimagenet', shots = 5, split = 'train'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/datasets/test_datasets_helpers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
_________________ test_datasets_helpers[val-1-tieredimagenet] __________________

name = 'tieredimagenet', shots = 1, split = 'val'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/datasets/test_datasets_helpers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
_________________ test_datasets_helpers[val-5-tieredimagenet] __________________

name = 'tieredimagenet', shots = 5, split = 'val'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/datasets/test_datasets_helpers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
_________________ test_datasets_helpers[test-1-tieredimagenet] _________________

name = 'tieredimagenet', shots = 1, split = 'test'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/datasets/test_datasets_helpers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
_________________ test_datasets_helpers[test-5-tieredimagenet] _________________

name = 'tieredimagenet', shots = 5, split = 'test'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/datasets/test_datasets_helpers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
___________ test_datasets_helpers_dataloader[train-1-tieredimagenet] ___________

name = 'tieredimagenet', shots = 1, split = 'train'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_dataloader(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_dataloaders.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
___________ test_datasets_helpers_dataloader[train-5-tieredimagenet] ___________

name = 'tieredimagenet', shots = 5, split = 'train'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_dataloader(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_dataloaders.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
____________ test_datasets_helpers_dataloader[val-1-tieredimagenet] ____________

name = 'tieredimagenet', shots = 1, split = 'val'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_dataloader(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_dataloaders.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
____________ test_datasets_helpers_dataloader[val-5-tieredimagenet] ____________

name = 'tieredimagenet', shots = 5, split = 'val'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_dataloader(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_dataloaders.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
___________ test_datasets_helpers_dataloader[test-1-tieredimagenet] ____________

name = 'tieredimagenet', shots = 1, split = 'test'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_dataloader(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_dataloaders.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
___________ test_datasets_helpers_dataloader[test-5-tieredimagenet] ____________

name = 'tieredimagenet', shots = 5, split = 'test'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('shots', [1, 5])
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_dataloader(name, shots, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=shots,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_dataloaders.py:93: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
_____________ test_datasets_helpers_wrapper[train-tieredimagenet] ______________

name = 'tieredimagenet', split = 'train'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_wrapper(name, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=1,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_wrappers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
______________ test_datasets_helpers_wrapper[val-tieredimagenet] _______________

name = 'tieredimagenet', split = 'val'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_wrapper(name, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=1,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_wrappers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
______________ test_datasets_helpers_wrapper[test-tieredimagenet] ______________

name = 'tieredimagenet', split = 'test'

    @pytest.mark.skipif(not is_local, reason='Requires datasets downloaded locally')
    @pytest.mark.parametrize('name', helpers.__all__)
    @pytest.mark.parametrize('split', ['train', 'val', 'test'])
    def test_datasets_helpers_wrapper(name, split):
        function = getattr(helpers, name)
        folder = os.getenv('TORCHMETAL_DATA_FOLDER')
        download = bool(os.getenv('TORCHMETAL_DOWNLOAD', False))
    
>       dataset = function(folder,
                           ways=5,
                           shots=1,
                           test_shots=15,
                           meta_split=split,
                           download=download)

tests/utils/test_wrappers.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
torchmetal/datasets/helpers.py:168: in tieredimagenet
    return helper_with_default(TieredImagenet, folder, shots, ways,
torchmetal/datasets/helpers.py:35: in helper_with_default
    dataset = klass(folder, num_classes_per_task=ways, **kwargs)
torchmetal/datasets/tieredimagenet.py:89: in __init__
    dataset = TieredImagenetClassDataset(root, meta_train=meta_train,
torchmetal/datasets/tieredimagenet.py:128: in __init__
    self.download()
torchmetal/datasets/tieredimagenet.py:181: in download
    download_file_from_google_drive(self.gdrive_id, self.root,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

file_id = '1g1aIDy2Ar_MViF2gDXFYDBTR-HYecV07'
root = '/home/datenstrom/workspace_ssd/datasets/torchmetal/tieredimagenet'
filename = 'tiered-imagenet.tar', md5 = 'e07e811b9f29362d159a9edd0d838c62'

    def download_file_from_google_drive(file_id, root, filename=None, md5=None):
        """Download a Google Drive file from  and place it in root.
    
        Args:
            file_id (str): id of file to be downloaded
            root (str): Directory to place downloaded file in
            filename (str, optional): Name to save the file under. If None, use the id of the file.
            md5 (str, optional): MD5 checksum of the download. If None, do not check
        """
        # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url
        import requests
        url = "https://docs.google.com/uc?export=download"
    
        root = os.path.expanduser(root)
        if not filename:
            filename = file_id
        fpath = os.path.join(root, filename)
    
        os.makedirs(root, exist_ok=True)
    
>       if os.path.isfile(fpath) and check_integrity(fpath, md5):
E       NameError: name 'check_integrity' is not defined

torchmetal/datasets/utils.py:67: NameError
=============================== warnings summary ===============================
tests/modules/test_parallel.py::test_dataparallel_params_maml[model0]
  /home/datenstrom/.cache/pypoetry/virtualenvs/torchmetal-hZltVvxe-py3.8/lib/python3.8/site-packages/torch/cuda/nccl.py:48: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
    if not isinstance(inputs, collections.Container) or isinstance(inputs, torch.Tensor):

tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[train-1-omniglot]
tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[train-5-omniglot]
tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[val-1-omniglot]
tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[val-5-omniglot]
tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[test-1-omniglot]
tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[test-5-omniglot]
tests/utils/test_dataloaders.py::test_overflow_length_dataloader
  /home/datenstrom/.cache/pypoetry/virtualenvs/torchmetal-hZltVvxe-py3.8/lib/python3.8/site-packages/torchvision/transforms/functional.py:942: UserWarning: Argument interpolation should be of type InterpolationMode instead of int. Please, use InterpolationMode enum.
    warnings.warn(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
=========================== short test summary info ============================
FAILED tests/datasets/test_datasets_helpers.py::test_datasets_helpers[train-1-tieredimagenet]
FAILED tests/datasets/test_datasets_helpers.py::test_datasets_helpers[train-5-tieredimagenet]
FAILED tests/datasets/test_datasets_helpers.py::test_datasets_helpers[val-1-tieredimagenet]
FAILED tests/datasets/test_datasets_helpers.py::test_datasets_helpers[val-5-tieredimagenet]
FAILED tests/datasets/test_datasets_helpers.py::test_datasets_helpers[test-1-tieredimagenet]
FAILED tests/datasets/test_datasets_helpers.py::test_datasets_helpers[test-5-tieredimagenet]
FAILED tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[train-1-tieredimagenet]
FAILED tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[train-5-tieredimagenet]
FAILED tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[val-1-tieredimagenet]
FAILED tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[val-5-tieredimagenet]
FAILED tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[test-1-tieredimagenet]
FAILED tests/utils/test_dataloaders.py::test_datasets_helpers_dataloader[test-5-tieredimagenet]
FAILED tests/utils/test_wrappers.py::test_datasets_helpers_wrapper[train-tieredimagenet]
FAILED tests/utils/test_wrappers.py::test_datasets_helpers_wrapper[val-tieredimagenet]
FAILED tests/utils/test_wrappers.py::test_datasets_helpers_wrapper[test-tieredimagenet]
================= 15 failed, 260 passed, 8 warnings in 15.24s ==================
sevro added the bug label Jun 25, 2021

sevro commented Jun 25, 2021

The problem originates in torchmetal/datasets/utils.py and is related to a comment in that file, which should probably be tracked as its own issue:

QKFIX: The current version of download_file_from_google_drive (as of torchvision==0.8.1)
is inconsistent, and a temporary fix has been added to the bleeding-edge version of
Torchvision. The temporary fix removes the behaviour of _quota_exceeded, whenever the
quota has exceeded for the file to be downloaded. As a consequence, this means that there
is currently no protection against exceeded quotas. If you get an integrity error in torchmetal
(e.g. "MiniImagenet integrity check failed" for MiniImagenet), then this means that the quota
has exceeded for this dataset. See also: tristandeleu/pytorch-meta#54

See also: pytorch/vision#2992

The following functions are taken from
https://github.com/pytorch/vision/blob/cd0268cd408d19d91f870e36fdffd031085abe13/torchvision/datasets/utils.py

Most likely the failed tests do not indicate a real problem; this file has simply been over its download quota recently. It also seems that torchvision has merged a fix (pytorch/vision#4109) that could be incorporated here to give users a better error message.
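
Separately, the NameError in the tracebacks shows that check_integrity is referenced in torchmetal/datasets/utils.py but never imported or defined there, so the call site fails before any quota handling even comes into play. A minimal sketch of one possible fix, assuming it is acceptable either to reuse the torchvision helper or to vendor a small equivalent next to the copied function:

    # Option 1: reuse the helper the code was copied from.
    from torchvision.datasets.utils import check_integrity

    # Option 2: vendor a minimal equivalent in torchmetal/datasets/utils.py.
    import hashlib
    import os

    def calculate_md5(fpath, chunk_size=1024 * 1024):
        # Stream the file so large archives never have to fit in memory.
        md5 = hashlib.md5()
        with open(fpath, 'rb') as f:
            for chunk in iter(lambda: f.read(chunk_size), b''):
                md5.update(chunk)
        return md5.hexdigest()

    def check_integrity(fpath, md5=None):
        # Match the torchvision semantics: a missing file fails the check,
        # and a missing md5 skips the checksum comparison.
        if not os.path.isfile(fpath):
            return False
        if md5 is None:
            return True
        return calculate_md5(fpath) == md5

Either way the change is local to torchmetal/datasets/utils.py and does not address the quota issue itself.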
