Regular Paper

EmotionMap: Visual Analysis of Video Emotional Content on a Map

Cui-Xia Ma 1,2,3, Jian-Cheng Song 1,2,3, Qian Zhu 1,2,3, Kevin Maher 1,2,4, Ze-Yuan Huang 1,2,3, Hong-An Wang 1,2,3
1 State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, Beijing 100190, China
2 Beijing Key Laboratory of Human-Computer Interaction, Institute of Software, Chinese Academy of Sciences, Beijing 100190, China
3 University of Chinese Academy of Sciences, Beijing 100190, China
4 Academy of Arts and Design, Tsinghua University, Beijing 100084, China

Abstract

Emotion plays a crucial role in gratifying users’ needs during their experience of movies and TV series, yet it may be underutilized as a framework for exploring and analyzing video content. In this paper, we present EmotionMap, a novel way of presenting emotion to everyday users as 2D geography, fusing spatio-temporal information with emotional data. The interface is composed of interconnected visualization elements that facilitate the exploration, understanding, and searching of video content. EmotionMap conveys a video’s overall emotional profile at a glance while also supporting rapid access to details. First, we develop EmotionDisc, an effective tool for collecting audiences’ emotions that builds on established emotion representation models. We collect audience and character emotional data, and then adopt the map metaphor to visualize video content and emotion in a hierarchical structure. EmotionMap also incorporates sketch interaction, providing a natural approach for users’ active exploration. The novelty and effectiveness of EmotionMap are demonstrated through a user study and experts’ feedback.
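For readers who want a concrete sense of how disc-based emotion collection can work, the sketch below illustrates one plausible mapping rather than the authors’ actual implementation: assuming the disc follows Russell’s valence-arousal circumplex (angle encoding emotion quality, radius encoding intensity), a click on the disc is converted into a timestamped valence-arousal sample. All names, parameters, and coordinate conventions here (EmotionSample, click_to_sample) are hypothetical.

```python
# Minimal sketch (not the authors' implementation): map a click on a
# disc-shaped widget to a timestamped valence-arousal sample, assuming the
# disc follows Russell's circumplex model. All names are illustrative.
import math
from dataclasses import dataclass

@dataclass
class EmotionSample:
    time_s: float   # video playback time when the rating was given
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm) .. +1 (excited)

def click_to_sample(x: float, y: float, cx: float, cy: float,
                    radius: float, time_s: float) -> EmotionSample:
    """Convert a click at (x, y) on a disc centered at (cx, cy) into valence-arousal."""
    dx, dy = (x - cx) / radius, (cy - y) / radius   # screen y grows downward
    r = min(math.hypot(dx, dy), 1.0)                # clamp to the disc boundary
    theta = math.atan2(dy, dx)                      # 0 rad = positive valence axis
    return EmotionSample(time_s=time_s,
                         valence=r * math.cos(theta),
                         arousal=r * math.sin(theta))

# Example: a click slightly right of and above the center at t = 42 s
sample = click_to_sample(x=130, y=80, cx=100, cy=100, radius=100, time_s=42.0)
print(sample)  # high valence, moderate arousal
```

Aggregating such samples over playback time would yield the kind of per-segment audience emotion data that a map-style overview can then visualize.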

Electronic Supplementary Material

jcst-35-3-576-Highlights.pdf (1.3 MB)


Journal of Computer Science and Technology
Pages 576-591
Cite this article:
Ma C-X, Song J-C, Zhu Q, et al. EmotionMap: Visual Analysis of Video Emotional Content on a Map. Journal of Computer Science and Technology, 2020, 35(3): 576-591. https://doi.org/10.1007/s11390-020-0271-2

Received: 06 January 2020
Revised: 25 March 2020
Published: 29 May 2020
© Institute of Computing Technology, Chinese Academy of Sciences 2020