clean up
Kye committed Oct 8, 2023
1 parent cb0a761 commit 43d1aff
Showing 3 changed files with 17 additions and 2 deletions.
Binary file modified .DS_Store
16 changes: 15 additions & 1 deletion README.md
@@ -1,6 +1,6 @@
[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

-# Complex Transformer
+# Complex Transformer (WIP)
The open source implementation of the attention mechanism and transformer from "Building Blocks for a Complex-Valued Transformer Architecture", where the authors propose an attention mechanism for complex-valued signals or images such as MRI and remote sensing data.

They present:
@@ -43,5 +43,19 @@ print("Attention Output Shape:", attn_output.shape)

```
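
For intuition, here is a minimal, illustrative sketch of what scaled dot-product attention over complex-valued inputs can look like in plain PyTorch. It is a generic simplification rather than the paper's (or this repository's) exact formulation, and the function name `complex_attention` is hypothetical: the softmax is taken over the real part of the Hermitian query-key product, and the resulting real weights are applied to the complex-valued values.

```python
import torch
import torch.nn.functional as F

def complex_attention(q, k, v):
    # q, k, v: complex tensors of shape (batch, heads, seq, dim)
    d = q.shape[-1]
    # Hermitian (conjugate-transposed) inner product between queries and keys
    scores = torch.matmul(q, k.conj().transpose(-2, -1)) / d ** 0.5
    # softmax is only defined for real values, so attend over the real part of the scores
    weights = F.softmax(scores.real, dim=-1)
    # cast the real weights to the complex dtype and mix the complex-valued values
    return torch.matmul(weights.to(v.dtype), v)

q = torch.randn(1, 4, 16, 32, dtype=torch.cfloat)
k = torch.randn(1, 4, 16, 32, dtype=torch.cfloat)
v = torch.randn(1, 4, 16, 32, dtype=torch.cfloat)
print(complex_attention(q, k, v).shape)  # torch.Size([1, 4, 16, 32])
```

Other choices, such as softmaxing the magnitude of the scores, are equally plausible; see the paper for the formulation actually used here.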

# Architecture
- I use a regular (real-valued) norm instead of a complex-valued norm for simplicity; a rough sketch of this simplification is shown below.
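
As an illustration of that choice (an assumption about the approach, not code taken from this repo; the class name `RealNormForComplex` is hypothetical), one common simplification is to apply an ordinary real-valued `nn.LayerNorm` to the real and imaginary parts separately:

```python
import torch
import torch.nn as nn

class RealNormForComplex(nn.Module):
    """Normalize a complex tensor with two ordinary real-valued LayerNorms."""

    def __init__(self, dim: int):
        super().__init__()
        self.norm_real = nn.LayerNorm(dim)
        self.norm_imag = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: complex tensor of shape (..., dim); each component is normalized on its own
        return torch.complex(self.norm_real(x.real), self.norm_imag(x.imag))

x = torch.randn(2, 16, 64, dtype=torch.cfloat)
print(RealNormForComplex(64)(x).shape)  # torch.Size([2, 16, 64])
```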

# License
MIT

# Citations
```
@article{2306.09827,
Author = {Florian Eilers and Xiaoyi Jiang},
Title = {Building Blocks for a Complex-Valued Transformer Architecture},
Year = {2023},
Eprint = {arXiv:2306.09827},
Doi = {10.1109/ICASSP49357.2023.10095349},
}
```
3 changes: 2 additions & 1 deletion requirements.txt
@@ -1,2 +1,3 @@
torch
-einops
+einops
+zetascale
