The PyTorch implementation of "FeatWalk: Enhancing Few-Shot Classification through Local View Leveraging", AAAI 2024.

FeatWalk

FeatWalk is a method tailored to few-shot learning. It mines local views to mitigate the interference caused by discriminative features learned during global-view pre-training. By analyzing the correlation between local views and the different class prototypes, FeatWalk constructs a more comprehensive class-related representation. The paper has been accepted at AAAI 2024, and this repository is the official implementation.
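To make the idea above concrete, here is a minimal sketch of one plausible way to score a query's local views against class prototypes and aggregate them into class scores. This is an illustration only, not the official FeatWalk code; the function name, shapes, and temperature parameter are assumptions.

# Illustrative sketch only (not the repository's implementation).
import torch
import torch.nn.functional as F

def local_view_class_scores(local_feats: torch.Tensor,
                            prototypes: torch.Tensor,
                            temperature: float = 10.0) -> torch.Tensor:
    """local_feats: (n_views, d) features of a query's local crops.
    prototypes: (n_way, d) class prototypes from the support set.
    Returns (n_way,) aggregated class scores."""
    # Cosine similarity between every local view and every prototype: (n_views, n_way)
    sims = F.cosine_similarity(local_feats.unsqueeze(1), prototypes.unsqueeze(0), dim=-1)
    # Per-view soft assignment over classes, then averaged across views
    per_view_probs = F.softmax(temperature * sims, dim=-1)
    return per_view_probs.mean(dim=0)

# Toy usage with random features (5-way, 9 local views, 64-d embeddings)
scores = local_view_class_scores(torch.randn(9, 64), torch.randn(5, 64))
print(scores.argmax().item())  # predicted class index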

Comparison with Baseline Methods

The following table compares FeatWalk with the baseline method DeepBDC in standard few-shot learning (FSL) scenarios on MiniImageNet and TieredImageNet. FeatWalk consistently outperforms DeepBDC across all settings (a short note after the table sketches how accuracy-plus-interval numbers of this kind are typically computed).

| Method | Embedding | Mini 5-way 1-shot | Mini 5-way 5-shot | Tiered 5-way 1-shot | Tiered 5-way 5-shot |
| --- | --- | --- | --- | --- | --- |
| DeepBDC | BDC | 67.83 ± 0.43 | 85.45 ± 0.29 | 73.82 ± 0.47 | 89.00 ± 0.30 |
| FeatWalk | BDC | 70.21 ± 0.44 | 87.38 ± 0.27 | 75.25 ± 0.48 | 89.92 ± 0.29 |
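As a point of reference, few-shot results in this form are conventionally reported as the mean accuracy over many test episodes together with a 95% confidence interval. The snippet below is a minimal sketch of that convention, assuming a list of per-episode accuracies; it is not the repository's evaluation code.

# Sketch of the usual "mean ± 95% CI over episodes" computation (assumption, not repo code).
import numpy as np

def mean_and_ci95(episode_accuracies):
    acc = np.asarray(episode_accuracies, dtype=np.float64)
    mean = acc.mean()
    ci95 = 1.96 * acc.std(ddof=1) / np.sqrt(len(acc))
    return mean, ci95

# Example: 600 simulated episode accuracies around 0.70
accs = np.random.normal(loc=0.70, scale=0.10, size=600).clip(0, 1)
mean, ci = mean_and_ci95(accs)
print(f"{100 * mean:.2f} ± {100 * ci:.2f}")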

Preparation Before Running

Before starting with FeatWalk, please ensure the following preparations are made:

  1. Place the pre-trained models in the checkpoint directory. The pre-trained models can be obtained through the corresponding baseline methods or accessed from the official DeepBDC implementation.
  2. Ensure that the datasets (such as MiniImageNet) are located in the filelist directory, following the layout shown below (a small sanity-check snippet follows the layout).

Dataset Structure:

--FeatWalk
    |--filelist
        |--miniImageNet
            |--train
            |--val
            |--test
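The following is a hypothetical sanity check, not part of the repository: it assumes each split folder contains one subfolder per class (the standard image-folder layout) and simply verifies that the splits are readable before launching training.

# Hypothetical check of the filelist/miniImageNet layout (assumption, not repo code).
from torchvision import datasets

for split in ("train", "val", "test"):
    ds = datasets.ImageFolder(f"filelist/miniImageNet/{split}")
    print(f"{split}: {len(ds)} images across {len(ds.classes)} classes")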

Running Commands

To run FeatWalk, use the following command:

# 5-Way 1-shot/5-shot on MiniImageNet
sh run.sh

Acknowledgments

We would like to express our heartfelt gratitude to the authors of the open-source projects GoodEmbed and DeepBDC. Our code builds on and was informed by these implementations, and their contributions have been invaluable to our work.
