Prediction of Human Emotions toward Abstract Images by Image Features and Eye Tracking Device
Abstract
Emotion-based semantic search technology helps users retrieve images from a database according to the emotions those images evoke. However, the emotion an image stimulates can differ from user to user, because different users attend to different areas of interest within the same image. This paper presents a novel approach that increases the accuracy of emotion-based image classification by combining eye movement data with basic image features. The results show that combining eye movement data with color features yields better classification performance than using color features alone.
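As a rough illustration of the feature-level fusion the abstract describes, the sketch below concatenates a per-image color feature vector with a per-image eye-movement feature vector before classification, then compares cross-validated accuracy against color features alone. The feature dimensions, the choice of fixation statistics, and the SVM classifier are all assumptions made for illustration; the paper's actual features and learning method are not specified in this abstract.

```python
# Minimal sketch of feature-level fusion for emotion classification.
# Assumptions (not from the paper): 64-bin color histograms, 6 eye-movement
# statistics (e.g. fixation count, mean fixation duration), an RBF-kernel SVM,
# and random placeholder data standing in for real image/eye-tracking features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_images = 100
color_features = rng.random((n_images, 64))  # placeholder color histograms
eye_features = rng.random((n_images, 6))     # placeholder fixation statistics
labels = rng.integers(0, 2, n_images)        # binary emotion label per image

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Baseline: color features alone.
color_acc = cross_val_score(clf, color_features, labels, cv=5).mean()

# Fusion: concatenate color and eye-movement features into one vector.
fused = np.hstack([color_features, eye_features])
fused_acc = cross_val_score(clf, fused, labels, cv=5).mean()

print(f"color only: {color_acc:.3f}, color + eye movements: {fused_acc:.3f}")
```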
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.