Open Access
Issue: ITM Web Conf., Volume 78 (2025), International Conference on Computer Science and Electronic Information Technology (CSEIT 2025)
Article Number: 04025
Number of pages: 13
Section: Foundations and Frontiers in Multimodal AI, Large Models, and Generative Technologies
DOI: https://doi.org/10.1051/itmconf/20257804025
Published online: 08 September 2025