Attention is All Large Language Model Need
Yuxin Liu
ITM Web Conf., 73 (2025) 02025
DOI: https://doi.org/10.1051/itmconf/20257302025