
Commit 63d020b

Update README.md
1 parent e67980e commit 63d020b


1 file changed: +2 -7 lines changed


README.md (+2 -7)
@@ -141,13 +141,8 @@ views of the funding agencies.
 
 The experiments in this work benefit from the following open-source codes:
 * Ramsauer, Hubert, Bernhard Schäfl, Johannes Lehner, Philipp Seidl, Michael Widrich, Thomas Adler, Lukas Gruber et al. "Hopfield networks is all you need." arXiv preprint arXiv:2008.02217 (2020). https://github.com/ml-jku/hopfield-layers
-* Martins, Andre, and Ramon Astudillo. "From softmax to sparsemax: A sparse model of attention and multi-label classification." In International conference on machine learning, pp. 1614-1623. PMLR, 2016. [add sparsemax source code]
-* Correia, Gonçalo M., Vlad Niculae, and André FT Martins. "Adaptively sparse transformers." arXiv preprint arXiv:1909.00015 (2019). https://github.com/deep-spin/entmax
+* Martins, Andre, and Ramon Astudillo. "From softmax to sparsemax: A sparse model of attention and multi-label classification." In International conference on machine learning, pp. 1614-1623. PMLR, 2016. https://github.com/KrisKorrel/sparsemax-pytorch
+* Correia, Gonçalo M., Vlad Niculae, and André FT Martins. "Adaptively sparse transformers." arXiv preprint arXiv:1909.00015 (2019). https://github.com/deep-spin/entmax & https://github.com/prajjwal1/adaptive_transformer
 * Ilse, Maximilian, Jakub Tomczak, and Max Welling. "Attention-based deep multiple instance learning." In International conference on machine learning, pp. 2127-2136. PMLR, 2018. https://github.com/AMLab-Amsterdam/AttentionDeepMIL
 * Zhang, Yunhao, and Junchi Yan. "Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting." In The Eleventh International Conference on Learning Representations. 2022. https://github.com/Thinklab-SJTU/Crossformer
 * Millidge, Beren, Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, and Rafal Bogacz. "Universal hopfield networks: A general framework for single-shot associative memory models." In International Conference on Machine Learning, pp. 15561-15583. PMLR, 2022. https://github.com/BerenMillidge/Theory_Associative_Memory
-
-### We appreciate the following works for their valuable code and data:
-
-[https://github.com/KrisKorrel/sparsemax-pytorch/blob/master/sparsemax.py] <br>
-[https://github.com/prajjwal1/adaptive_transformer/blob/master/models/entmax.py] <br>
