
About replacing self-attention with PCA #53

Open
yyg1282142265 opened this issue Dec 29, 2024 · 2 comments

Comments

@yyg1282142265

Regarding the paper's statement "To verify this idea, we further conduct experiments where we replace the self-attention module with PCA and find token similarity patterns remain unchanged according to Figure 4 (b)": how exactly do you replace self-attention with PCA? Is this part included in the code?

@tianzhou2011
Contributor

tianzhou2011 commented Dec 29, 2024 via email

Utilize torch.pca_lowrank to perform the PCA operation, and set some eigenvalues to zero to isolate the primary signal.

Tian

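For anyone landing here later, here is a minimal sketch of what the reply above describes: run torch.pca_lowrank on each sequence's token matrix, zero the trailing singular values, and reconstruct the tokens from the retained components. The class name PCAMixer, the keep/q parameters, and the per-sample loop are illustrative assumptions, not code from this repository.

```python
import torch
import torch.nn as nn


class PCAMixer(nn.Module):
    """Sketch of a drop-in replacement for a self-attention block:
    project each sequence's tokens onto their principal components,
    zero the weaker components, and reconstruct."""

    def __init__(self, keep: int = 4, q: int = 16):
        super().__init__()
        self.keep = keep  # number of components retained; the rest are zeroed
        self.q = q        # number of components estimated by pca_lowrank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model), the usual attention input/output shape
        B, L, D = x.shape
        q = min(self.q, L, D)
        out = torch.empty_like(x)
        for b in range(B):
            tokens = x[b]                                   # (L, D)
            mean = tokens.mean(dim=0, keepdim=True)
            # pca_lowrank centers the input when center=True, so it
            # factorizes (tokens - mean); U: (L, q), S: (q,), V: (D, q)
            U, S, V = torch.pca_lowrank(tokens, q=q, center=True)
            S = S.clone()
            S[self.keep:] = 0.0  # zero trailing singular values to isolate the primary signal
            out[b] = U @ torch.diag(S) @ V.T + mean         # low-rank reconstruction
        return out


# Example: same shapes in and out, so it can stand in for an attention layer.
y = PCAMixer(keep=4)(torch.randn(2, 96, 64))  # y.shape == (2, 96, 64)
```

Because the input and output shapes match the usual (batch, seq_len, d_model) attention interface, the module can be swapped in wherever the self-attention block is constructed.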

@yyg1282142265
Author

I don't quite understand it yet, but I'll keep studying it. Thank you for your reply!
