Open Access
ITM Web Conf.
Volume 53, 2023
2nd International Conference on Data Science and Intelligent Applications (ICDSIA-2023)
Article Number: 02011
Number of pages: 13
Section: Machine Learning / Deep Learning
DOI: https://doi.org/10.1051/itmconf/20235302011
Published online: 01 June 2023
  1. E. P. Torres, E. A. Torres, M. Hernández Álvarez, and S. G. Yoo, “EEG-based BCI emotion recognition: A survey,” Sensors, vol. 20, no. 18, p. 5083, (2020).
  2. F. Shen, G. Dai, G. Lin, J. Zhang, W. Kong, and H. Zeng, “EEG-based emotion recognition using 4D convolutional recurrent neural network,” Cognitive Neurodynamics, vol. 14, no. 6, pp. 815–828, (2020).
  3. R.-N. Duan, J.-Y. Zhu, and B.-L. Lu, “Differential entropy feature for EEG-based emotion classification,” in 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), IEEE, pp. 81–84, (2013).
  4. G. Xiao, M. Shi, M. Ye, B. Xu, Z. Chen, and Q. Ren, “4D attention-based neural network for EEG emotion recognition,” Cognitive Neurodynamics, pp. 1–14, (2022).
  5. J. Li, Z. Zhang, and H. He, “Hierarchical convolutional neural networks for EEG-based emotion recognition,” Cognitive Computation, vol. 10, no. 2, pp. 368–380, (2018).
  6. Z. Jia, Y. Lin, X. Cai, H. Chen, H. Gou, and J. Wang, “SST-EmotionNet: Spatial-spectral-temporal based attention 3D dense network for EEG emotion recognition,” in Proceedings of the 28th ACM International Conference on Multimedia, pp. 2909–2917, (2020).
  7. W. Liu, J.-L. Qiu, W.-L. Zheng, and B.-L. Lu, “Comparing recognition performance and robustness of multimodal deep learning models for multimodal emotion recognition,” IEEE Transactions on Cognitive and Developmental Systems, (2021).
  8. T. Song, W. Zheng, P. Song, and Z. Cui, “EEG emotion recognition using dynamical graph convolutional neural networks,” IEEE Transactions on Affective Computing, vol. 11, no. 3, pp. 532–541, (2018).
  9. N. S. Suhaimi, J. Mountstephens, and J. Teo, “EEG-based emotion recognition: A state-of-the-art review of current trends and opportunities,” Computational Intelligence and Neuroscience, vol. 2020, (2020).
  10. V. Jadhav, N. Tiwari, and M. Chawla, “A review on EEG data classification methods for brain–computer interface,” in International Conference on Innovative Computing and Communications, Springer, pp. 747–760, (2023).
  11. W.-L. Zheng and B.-L. Lu, “Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks,” IEEE Transactions on Autonomous Mental Development, vol. 7, no. 3, pp. 162–175, (2015).
  12. W.-L. Zheng, W. Liu, Y. Lu, B.-L. Lu, and A. Cichocki, “EmotionMeter: A multimodal framework for recognizing human emotions,” IEEE Transactions on Cybernetics, vol. 49, no. 3, pp. 1110–1122, (2018).
  13. F. Fahimi, Z. Zhang, W. B. Goh, T.-S. Lee, K. K. Ang, and C. Guan, “Inter-subject transfer learning with an end-to-end deep convolutional neural network for EEG-based BCI,” Journal of Neural Engineering, vol. 16, no. 2, p. 026007, (2019).
  14. M. Soleymani, J. Lichtenauer, T. Pun, and M. Pantic, “A multimodal database for affect recognition and implicit tagging,” IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 42–55, (2011).
  15. S. Katsigiannis and N. Ramzan, “DREAMER: A database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices,” IEEE Journal of Biomedical and Health Informatics, vol. 22, no. 1, pp. 98–107, (2017).
  16. C. Szegedy, W. Liu, Y. Jia, et al., “Going deeper with convolutions,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–9, (2015).
  17. K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778, (2016).
  18. D. McNeely-White, J. R. Beveridge, and B. A. Draper, “Inception and ResNet features are (almost) equivalent,” Cognitive Systems Research, vol. 59, pp. 312–318, (2020).
