Open Access
ITM Web Conf.
Volume 40, 2021
International Conference on Automation, Computing and Communication 2021 (ICACC-2021)
Article Number 03048
Number of page(s) 6
Section Computing
Published online 09 August 2021
  1. P. Prajapati, A. Thakkar, and A. Ganatra, “A survey and current research challenges in multi-label classification methods,” International Journal of Soft Computing and Engineering (IJSCE), vol. 2, no. 1, pp. 248–252, 2012.
  2. M. Ivasic-Kos, M. Pobar, and I. Ipsic, “Automatic movie posters classification into genres,” in International Conference on ICT Innovations, pp. 319–328, Springer, 2014.
  3. W.-T. Chu and H.-J. Guo, “Movie genre classification based on poster images with deep neural networks,” in Proceedings of the Workshop on Multimodal Understanding of Social, Affective and Subjective Attributes, pp. 39–45, 2017.
  4. J. Wehrmann and R. C. Barros, “Movie genre classification: A multi-label approach based on convolutions through time,” Applied Soft Computing, vol. 61, pp. 973–982, 2017.
  5. S. Sung and R. Chokshi, “Classification of movie posters to movie genres.”
  6. G. Barney and K. Kaya, “Predicting genre from movie posters,” 2019.
  7. J. A. Wi, S. Jang, and Y. Kim, “Poster-based multiple movie genre classification using inter-channel features,” IEEE Access, vol. 8, pp. 66615–66624, 2020.
  8. A. Khan, A. Sohail, U. Zahoora, and A. S. Qureshi, “A survey of the recent architectures of deep convolutional neural networks,” Artificial Intelligence Review, vol. 53, no. 8, pp. 5455–5516, 2020.
  9. E. Ben-Baruch, T. Ridnik, N. Zamir, A. Noy, I. Friedman, M. Protter, and L. Zelnik-Manor, “Asymmetric loss for multi-label classification,” arXiv preprint arXiv:2009.14119, 2020.