Results 1 - 2 of 2
1.
Hum Factors ; 58(2): 322-43, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26772605

ABSTRACT

OBJECTIVE: The aim of this study was to validate the strategic task overload management (STOM) model, which predicts task switching when concurrent performance is impossible. BACKGROUND: The STOM model predicts that, in overload, tasks will be switched to in proportion to how attractive they are on the task attributes of high priority, interest, and salience and low difficulty, but that more-difficult tasks are less likely to be switched away from once they are being performed. METHOD: In Experiment 1, participants performed four tasks of the Multi-Attribute Task Battery and provided task-switching data to inform the roles of difficulty and priority. In Experiment 2, participants concurrently performed an environmental control task and a robotic arm simulation. Workload was varied by automating arm movement, by the phase of environmental control, and by the presence or absence of decision support for fault management. Attention to the two tasks was measured with a head tracker. RESULTS: Experiment 1 revealed the lack of influence of task priority and confirmed the differing roles of task difficulty. In Experiment 2, the percentage of attention allocated across the eight conditions was predicted by the STOM model from participants' ratings of the four attributes. Model predictions were compared against the empirical data and accounted for over 95% of the variance in task allocation. More-difficult tasks were performed longer than easier tasks, and task priority did not influence allocation. CONCLUSIONS: The multiattribute decision model provided a good fit to the data. APPLICATIONS: The STOM model is useful for predicting cognitive tunneling, given that human-in-the-loop simulation is time-consuming and expensive.


Subject(s)
Attention/physiology; Models, Theoretical; Robotics; Task Performance and Analysis; Computer Peripherals; Humans; Reproducibility of Results; Space Flight; User-Computer Interface
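
For readers who want to experiment with the core idea behind the STOM model, the Python sketch below shows one way a multi-attribute attractiveness computation could be implemented. The attribute names, the equal weights, the 1-5 rating scale, the softmax allocation rule, and the example numbers are illustrative assumptions, not the model's published equations; the model's second role for difficulty (inhibiting switches away from the task currently being performed) is only noted in a comment.

import math

# Hypothetical sketch of a STOM-style multi-attribute attractiveness
# computation. Attribute names, equal weights, the 1-5 rating scale, and
# the softmax allocation rule are illustrative assumptions, not the
# model's published equations.

def attractiveness(ratings, weights):
    """Weighted sum of attribute ratings; rated difficulty lowers the
    attractiveness of switching TO a task. (The published model also
    treats difficulty as inhibiting switches AWAY from the task being
    performed, which this sketch does not capture.)"""
    return (weights["priority"] * ratings["priority"]
            + weights["interest"] * ratings["interest"]
            + weights["salience"] * ratings["salience"]
            - weights["difficulty"] * ratings["difficulty"])

def predicted_allocation(tasks, weights, temperature=2.0):
    """Turn attractiveness scores into a predicted share of attention
    via a softmax; temperature controls how winner-take-all it is."""
    scores = {name: attractiveness(r, weights) for name, r in tasks.items()}
    exps = {name: math.exp(s / temperature) for name, s in scores.items()}
    total = sum(exps.values())
    return {name: e / total for name, e in exps.items()}

# Example: two tasks rated on 1-5 scales (made-up numbers, not data
# from the experiments reported above).
tasks = {
    "robotic_arm": {"priority": 4, "interest": 4, "salience": 3, "difficulty": 4},
    "env_control": {"priority": 3, "interest": 2, "salience": 3, "difficulty": 2},
}
weights = {"priority": 1.0, "interest": 1.0, "salience": 1.0, "difficulty": 1.0}

print(predicted_allocation(tasks, weights))
# e.g. {'robotic_arm': 0.62..., 'env_control': 0.37...}
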
2.
Hum Factors ; 57(5): 728-39, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25886768

ABSTRACT

OBJECTIVE: We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, which triggers the automation bias, and missing advice, which reflects complacency. BACKGROUND: Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic for HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. METHOD: Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which the automation was either gone (one group) or wrong (a second group). A control group received no automation support. RESULTS: The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," which reflects the impact of complacency. Some complacency was also evident for automation gone, manifested as longer latency and a more modest reduction in accuracy. CONCLUSIONS: Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, which reflects complacency. IMPLICATIONS: Decision-aiding automation should indicate its reduced confidence in uncertain environments to mitigate the automation bias.


Subject(s)
Automation; Decision Making; Decision Support Techniques; Man-Machine Systems; Humans; Mental Processes; Workload
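
To illustrate the abstract's implication rather than the study's actual simulation, the Python sketch below shows one way a decision aid could annotate its advice with an explicit low-confidence caveat. The DiagnosisAdvice fields, the 0.7 threshold, and the caveat wording are hypothetical assumptions introduced only for illustration.

from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of confidence-annotated decision aiding. The
# DiagnosisAdvice fields, the 0.7 threshold, and the caveat wording are
# assumptions made for illustration; they are not part of the study's
# process control simulation.

@dataclass
class DiagnosisAdvice:
    fault: str             # suspected fault, e.g. "pump_3_leak"
    confidence: float      # aid's own estimate, 0.0-1.0
    caveat: Optional[str]  # shown to the operator when confidence is low

def annotate_advice(fault: str, confidence: float,
                    low_confidence_threshold: float = 0.7) -> DiagnosisAdvice:
    """Attach an explicit caveat instead of presenting uncertain advice
    as though it were reliable."""
    caveat = None
    if confidence < low_confidence_threshold:
        caveat = ("Low-confidence diagnosis: cross-check the raw plant "
                  "indicators before acting.")
    return DiagnosisAdvice(fault=fault, confidence=confidence, caveat=caveat)

# Example: the same suspected fault reported with high and low confidence.
for c in (0.92, 0.55):
    advice = annotate_advice("pump_3_leak", c)
    print(advice.fault, f"({advice.confidence:.0%})", advice.caveat or "no caveat")
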