| Issue | ITM Web Conf., Volume 78, 2025 — International Conference on Computer Science and Electronic Information Technology (CSEIT 2025) |
|---|---|
| Article Number | 04028 |
| Number of page(s) | 12 |
| Section | Foundations and Frontiers in Multimodal AI, Large Models, and Generative Technologies |
| DOI | https://doi.org/10.1051/itmconf/20257804028 |
| Published online | 08 September 2025 |
Exploring the Principles of Prompt Tuning and the Progress of Research
College of Computer Science and Technology, Jilin University, Changchun, Jilin, 130012, China
Prompt Tuning is a lightweight fine-tuning method that enables efficient, parameter-frugal task adaptation of pre-trained language models (PLMs), and it represents an important contribution to the advancement of NLP technology. This paper explores the basic principles and methods of Prompt Tuning and analyzes the design characteristics of discrete and continuous prompts, along with their performance in different application scenarios. We find that discrete prompts perform well in interpretability and adaptation to simple tasks, whereas continuous prompts are more advantageous for complex tasks and cross-domain generalization. Moreover, Prompt Tuning significantly improves model performance in few-shot scenarios, with parameter efficiency several times higher than that of traditional fine-tuning. However, Prompt Tuning's dependence on prompt design and its limited generalization to certain tasks still call for further optimization. This paper aims to provide a useful reference for future theoretical research on, and practical applications of, Prompt Tuning.
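To make the parameter-efficiency claim concrete, a back-of-envelope comparison is sketched below. Continuous Prompt Tuning trains only a small set of soft-prompt vectors prepended to the input embeddings, while full fine-tuning updates every model weight. The model dimensions here (embedding size, layer count, vocabulary size, prompt length) are hypothetical round numbers for illustration, not figures from the paper.

```python
# Hypothetical transformer dimensions, for illustration only.
d_model = 1024        # embedding dimension
vocab = 32000         # vocabulary size
n_layers = 24         # number of transformer layers
d_ffn = 4 * d_model   # feed-forward hidden size

# Full fine-tuning updates every weight. A rough per-layer estimate:
# 4*d^2 for the attention projections (Q, K, V, output)
# plus 2*d*d_ffn for the two feed-forward matrices.
per_layer = 4 * d_model**2 + 2 * d_model * d_ffn
full_params = n_layers * per_layer + vocab * d_model  # + token embeddings

# Continuous Prompt Tuning trains only k soft-prompt vectors of size d_model;
# all PLM weights stay frozen.
k = 20
prompt_params = k * d_model

print(f"full fine-tuning : {full_params:>12,} trainable params")
print(f"prompt tuning    : {prompt_params:>12,} trainable params")
print(f"ratio            : {full_params // prompt_params:>12,}x fewer")
```

Under these assumed sizes the soft prompt accounts for roughly 20K trainable parameters versus hundreds of millions for full fine-tuning, which is what makes the method attractive in few-shot and multi-task settings.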
© The Authors, published by EDP Sciences, 2025
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

