AslanDing/Finetune-Fidelity

This repo contains the code for our ICLR 2025 paper "F-Fidelity: A Robust Framework for Faithfulness Evaluation of Explainable AI".

Update! We have added an easy-to-use example for image classification. It can easily be adapted to other domains, such as time series.

Use F-Fidelity as a Metric

  • If you just want to use our fidelity metric for evaluation, please refer to the example in the tools folder; a minimal sketch of the idea is shown below.
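
A minimal sketch of the idea behind the metric, assuming a classifier that has already been fine-tuned with random deletion and a single-image interface. The names below are illustrative rather than the repo's actual API, and the paper additionally sweeps removal ratios up to the fine-tuning ratio $\beta$ and defines a Fid$^-$ counterpart; the example in tools is the authoritative reference.

```python
import torch

@torch.no_grad()
def f_fidelity_plus(finetuned_model, x, explanation, target, alpha=0.1):
    """Probability drop when the top-alpha explained pixels are removed.

    x:           (1, C, H, W) input image
    explanation: (H, W) importance scores from any explainer (CAM, IG, ...)
    alpha:       removal ratio, kept no larger than the fine-tuning ratio beta
    """
    finetuned_model.eval()
    k = int(alpha * explanation.numel())
    topk = torch.topk(explanation.flatten(), k).indices   # most important pixels
    keep = torch.ones_like(explanation).flatten()
    keep[topk] = 0.0                                       # delete the explained pixels
    keep = keep.view(1, 1, *explanation.shape).to(x.device)

    p_full = finetuned_model(x).softmax(dim=-1)[0, target]
    p_masked = finetuned_model(x * keep).softmax(dim=-1)[0, target]
    return (p_full - p_masked).item()   # larger drop => more faithful explanation
```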

Reproduce (we provide the code for CIFAR-100 reproduction)

  • Train and fine-tune the classification model. During fine-tuning, please use random-deletion augmentation with ratio $\beta$. We provide our solution for this task (see the first sketch after this list).
  • Obtain initial explanations; we provide examples for CAMs and IGs. Then generate noise-degraded variants of these explanations (see the second sketch after this list).
  • Run the evaluation with F-Fidelity.
  • We also provide the CIFAR-100 results on Google Drive.
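
A sketch of the random-deletion augmentation used during fine-tuning (the masking granularity and function names are assumptions; the scripts in this repo are the authoritative reference):

```python
import torch

def random_deletion(x, beta=0.1):
    """Zero out a random fraction beta of spatial positions in a batch (B, C, H, W)."""
    b, _, h, w = x.shape
    keep = (torch.rand(b, 1, h, w, device=x.device) >= beta).to(x.dtype)
    return x * keep

# During fine-tuning, apply it to every batch before the forward pass, e.g.
# logits = model(random_deletion(images, beta=0.1))
```

And one possible way to obtain an IG explanation with captum and to generate noise-degraded variants of it; the blending scheme below is only an illustration of "noise-degraded explanations", not necessarily the exact procedure used in the paper:

```python
import torch
from captum.attr import IntegratedGradients

def explain_ig(model, x, target):
    """Per-pixel attribution map from Integrated Gradients, shape (H, W)."""
    ig = IntegratedGradients(model)
    attr = ig.attribute(x, target=target)     # same shape as x: (1, C, H, W)
    return attr.abs().sum(dim=1).squeeze(0)   # aggregate channels into one map

def degrade(attribution, level):
    """Blend a (normalized) attribution map with uniform noise; level=0 keeps it intact."""
    noise = torch.rand_like(attribution)
    return (1.0 - level) * attribution + level * noise

# A family of progressively degraded explanations for the evaluation step:
# degraded = [degrade(attr, level) for level in (0.0, 0.25, 0.5, 0.75, 1.0)]
```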

Environments & Libraries

  • numpy, Pillow (PIL), opencv-python, matplotlib, tqdm
  • pytorch, torchvision
  • captum
  • pytorch-grad-cam
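
The dependencies can typically be installed with pip, e.g. `pip install numpy pillow opencv-python matplotlib tqdm captum grad-cam` plus a `torch`/`torchvision` build matching your CUDA setup (exact package names may differ in your environment; pytorch-grad-cam is published on PyPI as `grad-cam`).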

Citation

@misc{zheng2024ffidelityrobustframeworkfaithfulness,
      title={F-Fidelity: A Robust Framework for Faithfulness Evaluation of Explainable AI}, 
      author={Xu Zheng and Farhad Shirani and Zhuomin Chen and Chaohao Lin and Wei Cheng and Wenbo Guo and Dongsheng Luo},
      year={2024},
      eprint={2410.02970},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2410.02970}, 
}
