From 43d1afff43ded2e14661a6c91d64b73cfd7a86e9 Mon Sep 17 00:00:00 2001
From: Kye
Date: Sun, 8 Oct 2023 10:27:42 -0400
Subject: [PATCH] clean up

---
 .DS_Store        | Bin 6148 -> 6148 bytes
 README.md        |  16 +++++++++++++++-
 requirements.txt |   3 ++-
 3 files changed, 17 insertions(+), 2 deletions(-)

diff --git a/.DS_Store b/.DS_Store
index b1bc858e925d93fb375ed4bdb59c7154b9b72c92..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 100644
GIT binary patch
delta 51
vcmZoMXfc?e&B4IH0LBv&MFg0D92j6^U=Y}txQu;b0oP;$5thx|96$L1$0-Qd

delta 65
zcmZoMXfc?eJ=s8n#fXI=g(06InV~qPI5{UNKR<_&0SuTR6a$D0!~zU(`HhK-#R0_;
B4P*cS

diff --git a/README.md b/README.md
index 189d1ea..2520fd1 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 [![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)
 
-# Complex Transformer
+# Complex Transformer (WIP)
 
 The open source implementation of the attention and transformer from "Building Blocks for a Complex-Valued Transformer Architecture" where they propose an attention mechanism for complex-valued signals or images such as MRI and remote sensing. They present:
 
@@ -43,5 +43,19 @@ print("Attention Output Shape:", attn_output.shape)
 
 ```
 
+# Architecture
+- I use regular norm instead of complex norm for simplicity
+
 # License
 MIT
+
+# Citations
+```
+@article{2306.09827,
+Author = {Florian Eilers and Xiaoyi Jiang},
+Title = {Building Blocks for a Complex-Valued Transformer Architecture},
+Year = {2023},
+Eprint = {arXiv:2306.09827},
+Doi = {10.1109/ICASSP49357.2023.10095349},
+}
+```
diff --git a/requirements.txt b/requirements.txt
index 8d887b0..50fce02 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,2 +1,3 @@
 torch
-einops
\ No newline at end of file
+einops
+zetascale
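
Note on the "regular norm instead of complex norm" line added to the README above: one common simplification is to apply an ordinary LayerNorm to the real and imaginary parts of a complex tensor separately, rather than using a true complex-valued normalization. The sketch below illustrates that idea in PyTorch; the class name `RealImagLayerNorm` and the tensor shapes are illustrative assumptions, not code from this repository.

```python
import torch
import torch.nn as nn


class RealImagLayerNorm(nn.Module):
    """Hypothetical sketch: normalize a complex tensor by applying an
    ordinary LayerNorm to its real and imaginary parts independently."""

    def __init__(self, dim: int):
        super().__init__()
        self.norm_real = nn.LayerNorm(dim)
        self.norm_imag = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: complex tensor of shape (batch, seq_len, dim)
        real = self.norm_real(x.real)
        imag = self.norm_imag(x.imag)
        return torch.complex(real, imag)


if __name__ == "__main__":
    x = torch.randn(2, 16, 64, dtype=torch.cfloat)  # toy complex input
    norm = RealImagLayerNorm(64)
    print(norm(x).shape)  # torch.Size([2, 16, 64]), still complex-valued
```

Splitting the norm this way keeps everything in standard float operations, which is presumably what makes it the simpler choice; the patch itself does not show how the repository actually implements the norm, so treat this only as one plausible reading.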