Selective visual attention determines what pedestrians notice and ignore in urban environments. If visual attention is consistent across individuals, designers can exploit the underlying mechanisms to better meet user needs. However, the mechanisms of pedestrians' visual attention remain poorly understood, and it is challenging to forecast which locations in an urban environment will attract more attention. To address this gap, we employed 360° video and immersive virtual reality to simulate walking scenarios and recorded eye movements from 138 participants. Our findings reveal remarkable consistency in fixation distributions across individuals, exceeding both chance and orientation bias. One driver of this consistency is a strategy of information maximization: participants tended to fixate regions of higher local entropy. In addition, we built the first eye-movement dataset for panoramic videos of diverse urban walking scenes and trained a supervised deep-learning model to predict pedestrians' visual attention. This predictive model helps designers understand, during the design phase, how pedestrians will visually interact with the urban environment. The dataset and code of the predictive model are available at https://github.com/LiamXie/UrbanVisualAttention
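The abstract does not specify how local entropy is computed; a minimal sketch, assuming Shannon entropy of the intensity histogram over non-overlapping image patches (the patch size of 16 pixels and the grayscale input are illustrative assumptions, not the paper's stated settings):

```python
import numpy as np

def local_entropy(gray, patch=16):
    """Shannon entropy (bits) of the intensity histogram of each
    non-overlapping patch of a 2-D uint8 grayscale frame."""
    h, w = gray.shape
    rows, cols = h // patch, w // patch
    ent = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            block = gray[i * patch:(i + 1) * patch,
                         j * patch:(j + 1) * patch]
            counts = np.bincount(block.ravel(), minlength=256)
            p = counts / counts.sum()
            p = p[p > 0]  # drop empty bins; 0*log(0) is taken as 0
            ent[i, j] = -np.sum(p * np.log2(p))
    return ent

# A uniform patch carries no information (entropy 0); uniform random
# noise approaches the 8-bit maximum.
flat = np.full((16, 16), 128, dtype=np.uint8)
noise = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
print(local_entropy(flat)[0, 0])   # 0.0
print(local_entropy(noise)[0, 0])  # close to 8
```

Under the information-maximization account, a map of such per-patch entropies would correlate positively with the measured fixation density.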