About replacing self-attention with PCA #53

"To verify this idea, we further conduct experiments where we replace the self-attention module with PCA and find token similarity patterns remain unchanged according to Figure 4 (b)." Could you tell me how to replace self-attention with PCA? Is that part included in the code?

Comments
Use torch.pca_lowrank to perform the PCA operation, and set the smaller singular values to zero to isolate the primary signal.
Tian
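A minimal sketch of what this might look like, assuming the tokens entering the attention block have shape (batch, num_tokens, dim). The function name pca_token_mixing, the rank cutoff k, and the centering choice are illustrative assumptions, not taken from this repository:

```python
import torch

def pca_token_mixing(x: torch.Tensor, k: int = 8) -> torch.Tensor:
    """Replace the self-attention output with a rank-k PCA reconstruction.

    x: token features of shape (batch, num_tokens, dim) -- assumed layout.
    k: number of principal components to keep; truncating to k components
       is equivalent to zeroing out the remaining singular values.
    """
    # Center tokens along the sequence dimension so the decomposition
    # captures variance around the mean token rather than the mean itself.
    mean = x.mean(dim=1, keepdim=True)
    x_centered = x - mean

    # torch.pca_lowrank returns an approximate rank-q SVD: x ≈ U diag(S) Vᵀ.
    q = min(k, x.shape[1], x.shape[2])
    U, S, V = torch.pca_lowrank(x_centered, q=q, center=False)

    # Rank-k reconstruction of the token matrix, keeping only the
    # dominant components (the primary signal).
    x_lowrank = U @ torch.diag_embed(S) @ V.transpose(-2, -1)
    return x_lowrank + mean
```

In a ViT-style block one could then swap the attention call, e.g. use `x = x + pca_token_mixing(self.norm1(x))` in place of `x = x + self.attn(self.norm1(x))`, leaving the MLP, norms, and residuals untouched. Whether the paper's experiment was implemented exactly this way is not shown in this thread.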
Thank you for your reply. I don't quite understand it yet, but I'll study it further. Thanks again for your response!