1.
Hum Factors; 65(5): 833-845, 2023 Aug.
Article in English | MEDLINE | ID: mdl-34292078

ABSTRACT

OBJECTIVE: We controlled participants' glance behavior while using head-down displays (HDDs) and head-up displays (HUDs) to isolate driving behavioral changes due to use of different display types across different driving environments. BACKGROUND: Recently, HUD technology has been incorporated into vehicles, allowing drivers to, in theory, gather display information without moving their eyes away from the road. Previous studies comparing the impact of HUDs and traditional displays on human performance show differences in both drivers' visual attention and driving performance. Yet no studies have isolated glance behavior from driving behavior, which limits our ability to understand the cause of these differences and the resulting impact on display design. METHOD: We developed a novel method to control visual attention in a driving simulator. Twenty experienced drivers sustained visual attention to in-vehicle HDDs and HUDs while driving in both a simple, straight, and empty roadway environment and a more realistic driving environment that included traffic and turns. RESULTS: In the realistic environment, but not the simpler one, we found evidence of differing driving behaviors between display conditions, even though participants' glance behavior was similar. CONCLUSION: The assumption that visual attention can be evaluated in the same way for different types of vehicle displays may therefore be inaccurate, and these environment-dependent differences call into question the validity of testing HUDs in simplistic driving environments. APPLICATION: As we move toward the integration of HUD user interfaces into vehicles, it is important that we develop new, sensitive assessment methods to ensure that HUD interfaces are indeed safe for driving.


Subject(s)
Automobile Driving; Eye Movements; Humans; Accidents, Traffic
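
The study above holds participants' glance behavior constant so that any remaining differences between display conditions surface in driving measures rather than visual ones. A minimal sketch of that comparison logic in Python, using a hypothetical summary function and invented per-condition glance durations and lane-position samples (none of the names or values come from the paper):

    import statistics

    def summarize_condition(glance_durations_s, lane_offsets_m):
        """Summarize one display condition: mean glance duration to the
        display and standard deviation of lane position (SDLP). With
        glance behavior experimentally controlled, differences between
        conditions should appear in SDLP, not in glance duration."""
        return {
            "mean_glance_s": statistics.mean(glance_durations_s),
            "sdlp_m": statistics.stdev(lane_offsets_m),
        }

    # Hypothetical data: similar glance behavior, different lane keeping.
    hdd = summarize_condition([1.9, 2.1, 2.0], [0.12, 0.31, -0.05, 0.22, -0.18])
    hud = summarize_condition([2.0, 2.0, 2.1], [0.05, 0.10, -0.02, 0.08, -0.06])
    print(hdd, hud)
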
2.
IEEE Trans Vis Comput Graph; 28(8): 2834-2851, 2022 Aug.
Article in English | MEDLINE | ID: mdl-33315569

ABSTRACT

Augmented reality (AR) offers new ways to visualize information on the go. As noted in related work, AR graphics presented via optical see-through AR displays are particularly prone to color blending, whereby intended graphic colors may be perceptually altered by real-world backgrounds, ultimately degrading usability. This work adds to that body of knowledge by presenting a methodology for assessing AR interface color robustness, quantitatively measured via shifts in the CIE color space and qualitatively assessed in terms of users' perceived color names. We conducted a human factors study in which twelve participants examined eight AR colors atop three real-world backgrounds, viewed through an in-vehicle AR head-up display (HUD), a type of optical see-through display used to project driving-related information atop the forward-looking road scene. Participants completed visual search tasks, matched the perceived AR HUD color against the WCS color palette, and verbally named the perceived color. We present an analysis suggesting that blue, green, and yellow AR colors are relatively robust, while red and brown are not, and discuss the impact of chromaticity shift and dispersion on outdoor AR interface design. While this work presents a case study in transportation, the methodology is applicable to a wide range of AR displays across many application domains and settings.


Subject(s)
Augmented Reality; Automobile Driving; Smart Glasses; Computer Graphics; Humans; User-Computer Interface
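
The entry above quantifies color blending as shifts in the CIE color space. A minimal sketch of one such measurement, assuming colors are given as sRGB triples and the shift is taken as Euclidean distance in the CIE 1976 u'v' chromaticity plane; the paper's exact conversion pipeline and distance metric may differ:

    import numpy as np

    def srgb_to_xyz(rgb):
        """Convert an sRGB triple in [0, 1] to CIE XYZ (D65 white point)."""
        rgb = np.asarray(rgb, dtype=float)
        linear = np.where(rgb <= 0.04045, rgb / 12.92,
                          ((rgb + 0.055) / 1.055) ** 2.4)
        m = np.array([[0.4124564, 0.3575761, 0.1804375],
                      [0.2126729, 0.7151522, 0.0721750],
                      [0.0193339, 0.1191920, 0.9503041]])
        return m @ linear

    def xyz_to_uv(xyz):
        """CIE 1976 u'v' chromaticity coordinates."""
        x, y, z = xyz
        denom = x + 15.0 * y + 3.0 * z
        return np.array([4.0 * x / denom, 9.0 * y / denom])

    def chromaticity_shift(intended_rgb, perceived_rgb):
        """Euclidean u'v' distance between intended and perceived colors."""
        return np.linalg.norm(xyz_to_uv(srgb_to_xyz(intended_rgb)) -
                              xyz_to_uv(srgb_to_xyz(perceived_rgb)))

    # Example: saturated red washed out toward pink by a bright background.
    print(chromaticity_shift([1.0, 0.0, 0.0], [1.0, 0.45, 0.45]))
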
3.
Appl Ergon; 96: 103510, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34161853

ABSTRACT

While researchers have explored the benefits of adding augmented reality graphics to vehicle displays, the impact of graphic characteristics has not been well researched. In this paper, we consider the impact of augmented reality graphic spatial location and motion, as well as turn direction, traffic presence, and gender, on participants' driving and glance behavior and preferences. Twenty-two participants navigated through a simulated environment while using four different graphics. We employed a novel glance allocation analysis to differentiate, with more granularity, the information likely gathered with each glance. Fixed graphics generally resulted in less visual attention and more time scanning for hazards than animated graphics. Finally, participants preferred the screen-fixed graphic over all world-relative graphics, suggesting that spatially integrating graphics into the world may not always be necessary in visually complex urban environments like the one considered in this study.


Subject(s)
Augmented Reality; Automobile Driving; Humans; Motion
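
A glance allocation analysis like the one above starts by aggregating raw gaze samples into discrete glances per area of interest (AOI). A minimal sketch of that aggregation step, assuming fixed-rate gaze samples already labeled by AOI; the paper's actual analysis differentiates the information gathered per glance with more granularity than this:

    from itertools import groupby

    def glance_allocation(aoi_samples, dt=1.0 / 60.0):
        """Collapse consecutive same-AOI gaze samples into glances and
        return total dwell time (s) and glance count per AOI.

        aoi_samples: sequence of AOI labels, one per sample at rate 1/dt.
        """
        totals, counts = {}, {}
        for aoi, run in groupby(aoi_samples):
            n = sum(1 for _ in run)
            totals[aoi] = totals.get(aoi, 0.0) + n * dt
            counts[aoi] = counts.get(aoi, 0) + 1
        return totals, counts

    # Example: gaze alternating between the road, an AR graphic, and a hazard.
    samples = ["road"] * 120 + ["graphic"] * 30 + ["road"] * 90 + ["hazard"] * 12
    print(glance_allocation(samples))
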
4.
Front Robot AI; 6: 98, 2019.
Article in English | MEDLINE | ID: mdl-33501113

ABSTRACT

Optical see-through automotive head-up displays (HUDs) are a form of augmented reality (AR) that is quickly gaining penetration into the consumer market. Despite increasing adoption, demand, and competition among manufacturers to deliver higher-quality HUDs with larger fields of view, little work has been done to understand how best to design and assess AR HUD user interfaces, or how to quantify their effects on driver behavior, performance, and ultimately safety. This paper reports on a novel, low-cost, immersive driving simulator built from custom hardware and software specifically to examine basic and applied research questions related to AR HUD usage while driving. We describe our experiences developing the simulator hardware and software and detail a user study that examines driver performance, visual attention, and preferences using two AR navigation interfaces. Results suggest that conformal AR graphics may not be inherently better than other HUD interfaces. We include lessons learned from our simulator development and the results of the user study, and conclude with limitations and future work.
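
Conformal (world-fixed) AR graphics such as the navigation cues studied above must be re-projected into display coordinates every frame as the vehicle moves. A minimal sketch of that per-frame step, assuming a simple pinhole model with hypothetical intrinsics (f, cx, cy); a real HUD renderer would also account for eye position, combiner optics, and distortion:

    import numpy as np

    def project_to_hud(point_world, R, t, f=1000.0, cx=640.0, cy=360.0):
        """Project a world-fixed 3D point into 2D display pixel
        coordinates with a pinhole model.

        R, t: rotation and translation mapping world coordinates into
        the viewer (eye/camera) frame. Returns None if the point is
        behind the viewer.
        """
        p = R @ np.asarray(point_world, dtype=float) + t
        if p[2] <= 0.0:
            return None
        return np.array([f * p[0] / p[2] + cx, f * p[1] / p[2] + cy])

    # Example: a navigation waypoint 20 m ahead and 2 m to the right,
    # with the viewer frame aligned to the world frame.
    print(project_to_hud([2.0, 0.0, 20.0], np.eye(3), np.zeros(3)))
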
