Human Factors and alert systems
The Maui fires were Hawaii’s deadliest, killing 106 people, many of them on the motorway trying to escape. Over 2,200 structures were destroyed, 678 acres of land were affected, and the human (and animal) suffering was immense.
Many are now asking why the alert system located throughout Maui wasn’t used to warn the community early. There isn’t enough information yet to answer this question, but the system clearly failed, and alert systems can fail for many reasons.
Human performance in alert systems is researched using signal detection theory (SDT), Bayesian analysis, and fuzzy SDT (Parasuraman & Masalonis, 2017). Signal detection theory frames scenarios in terms of whether a signal is present and whether a response is observed. In the table in Figure 1, the green responses are the desired ones: if there is an emergency and we alert, that is good; if there is no emergency and we don’t alert, that is also good. But alerting when there isn’t an emergency (a false alarm) creates distrust, and the alert will likely be ignored in the future. The miss is what we saw in Maui: the signal (in this case, wildfires) was present, but the observer did not respond.
Figure 1. Signal detection theory table
Figure 2. Signal detection theory graph
The reasons for this could be many, but one is having too many false alarms, so that the operator turns the alarms off because of the high rate of noise. Maui has one of the largest outdoor siren systems in the world, because the islands are vulnerable to tsunami, and yet it failed to alert people in time. When using automation (on which these systems are based), there is a trade-off between accurate reporting (hits) and false alarms. There is a mathematical measure of a system’s sensitivity, with the aim of reaching close to 100% detection with few false alarms. Because sensing technologies are vulnerable to noise, fuzzy SDT allows for a non-binary response that is ambiguous and complex to detect: it could be a signal or it could be a false alarm. Therefore, humans are usually still needed to make the decision to alert or not. This is also where human error occurs.
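The “mathematical measure of sensitivity” referred to above is commonly the SDT index d′ (d-prime): the distance between the z-transformed hit rate and false-alarm rate. A sketch of the standard formula, assuming a hypothetical system with a 99% hit rate and a 1% false-alarm rate:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d' = Z(hit rate) - Z(false-alarm rate),
    where Z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# A highly sensitive system: many hits, few false alarms -> large d'
print(round(d_prime(0.99, 0.01), 2))

# A system no better than chance: hit rate equals false-alarm rate -> d' = 0
print(round(d_prime(0.50, 0.50), 2))
```

Higher d′ means the system separates true emergencies from noise more cleanly; a d′ of zero means its alerts carry no information at all.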
“In most real environments, the definition of a signal as observed by the human operator is context dependent and varies with several factors” (Parasuraman & Masalonis, 2017). In other words, alerts are ambiguous, and humans make errors based on schemas, beliefs, and previous experience with the system.
It is unclear at this stage whether the error was technology-based, human-based, or both, but either way it is one we want to avoid in the future. Psychological research now suggests we train with and treat technology as a teammate, and that we develop team cognition through new paradigms and theories to strengthen this relationship (Cooke et al., 2023).
References
Cooke, N. J., Cohen, M. C., Fazio, W. C., Inderberg, L. H., Johnson, C. J., Lematta, G. J., Peel, M., & Teo, A. (2023). From teams to teamness: Future directions in the science of team cognition.
Parasuraman, R., & Masalonis, A. J. (2017). Designing automated alerting systems: Standard and fuzzy signal detection theory and Bayesian analysis. Proceedings of the IEA 2000/HFES conference.