This repository collects work on high-order neural network architectures in computer vision.
We trace the line of development from attention mechanisms, through Transformers, to architectures built on element-wise multiplication (the Hadamard product).
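To make that recurring theme concrete, here is a minimal PyTorch sketch of a second-order interaction built from element-wise multiplication. The module name, projection layout, and shapes are our own illustration, not an implementation from any paper listed here.

```python
import torch
import torch.nn as nn

class HadamardInteraction(nn.Module):
    """Toy second-order layer: y = (W1 x) * (W2 x) + W3 x.

    The element-wise product of two linear projections makes the output
    quadratic in the input, giving a second-order feature interaction
    without computing an explicit attention map.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.proj1 = nn.Linear(dim, dim)
        self.proj2 = nn.Linear(dim, dim)
        self.proj3 = nn.Linear(dim, dim)  # first-order (linear) term

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj1(x) * self.proj2(x) + self.proj3(x)

x = torch.randn(8, 196, 64)              # (batch, tokens, channels)
print(HadamardInteraction(64)(x).shape)  # torch.Size([8, 196, 64])
```

Unlike self-attention, which realizes its high-order interaction through token-by-token similarity matrices, this Hadamard form costs only linear time in the number of tokens; that trade-off is the thread running through the papers below.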
Entries prefixed with `-` are our works.
[ICLR2021] An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
[ICML2021] Training data-efficient image transformers & distillation through attention
[ICCV2021] Swin Transformer: Hierarchical Vision Transformer Using Shifted Windows
[ICCV2017] SORT: Second-Order Response Transform for Visual Recognition
- [ASP-DAC2024] QuadraNet: Improving High-Order Neural Interaction Efficiency with Hardware-Aware Quadratic Neural Networks
- [arXiv2024] QuadraNet V2: Efficient and Sustainable Training of High-Order Neural Networks with Quadratic Adaptation
[TPAMI2021] Deep Polynomial Neural Networks
[CVPR2020] Π-nets: Deep Polynomial Neural Networks
[ECCV2022] Augmenting Deep Classifiers with Polynomial Neural Networks
[ICLR2022] The Spectral Bias of Polynomial Neural Networks
[NeurIPS2022] Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study
[NeurIPS2022] HorNet: Efficient High-Order Spatial Interactions with Recursive Gated Convolutions (a simplified sketch of the recursive gating pattern follows this list)
[arXiv2022] Conv2Former: A Simple Transformer-Style ConvNet for Visual Recognition
[NeurIPS2022] Focal Modulation Networks
[CVPR2024] Rewrite the Stars
- [NeurIPS2024] Infinite-dimensional Feature Interaction
[CVPR2019] Kervolutional Neural Networks
[ICCV2017] Factorized Bilinear Models for Image Recognition
[CVPR2019] Global Second-order Pooling Convolutional Networks
[TNNLS2021] Detachable Second-Order Pooling: Toward High-Performance First-Order Networks
[NeurIPS2019] Deep Multimodal Multilinear Fusion with High-order Polynomial Pooling
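Several entries above (HorNet, Conv2Former, Focal Modulation Networks, Rewrite the Stars) build their spatial mixing from repeated element-wise gating. The sketch below, referenced from the HorNet entry, shows the recursive pattern in simplified form; the class name, the fixed 7x7 depthwise kernel, and the constant channel width are illustrative assumptions, whereas HorNet's actual gnConv also varies channel width across orders.

```python
import torch
import torch.nn as nn

class RecursiveGating(nn.Module):
    """Simplified n-order recursive gating in the spirit of HorNet's gnConv.

    Each step multiplies the running output element-wise with a
    depthwise-convolved gate, so the final output contains feature
    interactions up to order `orders + 1`.
    """
    def __init__(self, dim: int, orders: int = 3):
        super().__init__()
        self.dim, self.orders = dim, orders
        self.proj_in = nn.Conv2d(dim, dim * (orders + 1), kernel_size=1)
        self.dwconv = nn.Conv2d(dim * orders, dim * orders, kernel_size=7,
                                padding=3, groups=dim * orders)
        self.proj_out = nn.Conv2d(dim, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        parts = self.proj_in(x).split(self.dim, dim=1)   # orders + 1 chunks
        gates = self.dwconv(torch.cat(parts[1:], dim=1)).split(self.dim, dim=1)
        y = parts[0]
        for g in gates:           # one element-wise product per extra order
            y = y * g
        return self.proj_out(y)

x = torch.randn(2, 64, 56, 56)           # (batch, channels, H, W)
print(RecursiveGating(64)(x).shape)      # torch.Size([2, 64, 56, 56])
```

Each pass through the loop raises the polynomial order of the output by one, which is how this family of architectures captures long-range, high-order interactions with convolutions alone.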