This repository is the official implementation of the experiments in our paper:

**Universal Approximation with Softmax Attention**
by Jerry Yao-Chieh Hu\*, Hude Liu\*, Hong-Yu Chen\*, Weimin Wu, Han Liu
The code was tested with Python 3.11. To install the dependencies, run `pip install -r requirements.txt`.
Run `truncated.py`.
Run `attn_map.py`.
First generate the data by running `seq2seq_data.py`, then run `seq2seq.py`.
If you have any questions regarding our paper or code, please feel free to open an issue or email Hong-Yu Chen ([email protected]). If you find our work useful, please cite our paper:
```bibtex
@article{hu2025universal,
  title={Universal Approximation with Softmax Attention},
  author={Hu, Jerry Yao-Chieh and Liu, Hude and Chen, Hong-Yu and Wu, Weimin and Liu, Han},
  journal={arXiv preprint arXiv:2504.15956},
  year={2025}
}
```