
Automation bias

Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information produced without automation, even when that information is correct. The concept stems from social psychology research that found a bias in human-human interaction: people assign more positive evaluations to decisions made by humans than to a neutral object. The same positivity bias has been found in human-automation interaction, where automated decisions are rated more positively than neutral ones.

Disuse and misuse
An operator's trust in the system can also lead to different interactions with the system, including use, misuse, disuse, and abuse. The tendency toward overreliance on automated aids is known as "automation misuse". Misuse occurs when a user fails to properly monitor an automated system, or when the system is used in situations where it should not be. It contrasts with disuse, in which the user fails to take advantage of the automation, either by turning it off or by ignoring its output. Both misuse and disuse can be problematic, but automation bias is directly related to misuse, through either excessive trust in the abilities of the system or defaulting to heuristics. Misuse can lead to a lack of monitoring of the automated system or to blind agreement with an automation suggestion, categorized as two types of errors: errors of omission and errors of commission, respectively.

For example, information acquisition, the first step in information processing, is the process by which a user registers input via the senses. An automated engine gauge might assist the user with information acquisition through simple interface features, such as highlighting changes in the engine's performance, thereby directing the user's selective attention. When faced with problems in an aircraft, pilots may tend to overtrust the engine gauges, losing sight of possible malfunctions unrelated to the engine. This attitude is a form of automation complacency and misuse. If, however, the pilot devotes time to interpreting the engine gauge and manipulating the aircraft accordingly, only to discover that the flight turbulence has not changed, the pilot may be inclined to ignore future error recommendations conveyed by the engine gauge, a form of automation complacency leading to disuse.
Errors of commission and omission
Automation bias can take the form of commission errors, which occur when users follow an automated directive without taking other sources of information into account. Conversely, omission errors occur when automated devices fail to detect or indicate problems and the user does not notice, either due to low vigilance or overtrust in the system. Errors of omission have been shown to result from decrements in cognitive vigilance, while errors of commission result from a combination of failing to take information into account and excessive trust in the reliability of automated aids. For example, a spell-checking program incorrectly marking a word as misspelled and suggesting an alternative would be an error of commission, while a spell-checking program failing to notice a misspelled word would be an error of omission. In these cases, automation bias could be observed in a user accepting the suggested alternative without consulting a dictionary, or in a user failing to notice the misspelled word and assuming all the words are correct without reviewing them. Training focused on reducing automation bias and related problems has been shown to lower the rate of commission errors, but not of omission errors.
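The spell-checker example above can be made concrete with a small sketch. This is illustrative only: the miniature word list, the function names, and the flaws planted in the list are all invented for this example, not taken from any real spell checker.

```python
# Deliberately flawed word list: it is missing a valid word ("kayak")
# and wrongly contains a common misspelling ("wierd").
WORD_LIST = {"receive", "separate", "definitely", "wierd"}

def checker_verdict(word):
    """The automation's output: 'flag' (claims misspelled) or 'pass'."""
    return "pass" if word in WORD_LIST else "flag"

def classify_error(word, truly_correct):
    """Compare the checker's verdict against ground truth.

    Commission error: the checker flags a correctly spelled word, and a
    biased user accepts the bogus correction without checking.
    Omission error: the checker passes a misspelled word, and a biased
    user assumes every unflagged word is correct.
    """
    verdict = checker_verdict(word)
    if verdict == "flag" and truly_correct:
        return "commission"
    if verdict == "pass" and not truly_correct:
        return "omission"
    return "no error"

print(classify_error("kayak", truly_correct=True))   # commission
print(classify_error("wierd", truly_correct=False))  # omission
```

The automation bias itself lies not in the checker's mistakes but in the user's response to them: accepting the flag, or the silence, without independent verification.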
Factors
The presence of automatic aids, as one source puts it, "diminishes the likelihood that decision makers will either make the cognitive effort to seek other diagnostic information or process all available information in cognitively complex ways." It also renders users more likely to conclude their assessment of a situation too hastily after being prompted by an automatic aid to take a specific course of action. A review of automation bias studies categorized both the types of tasks in which automated aids were used and the functions those aids served. Tasks were categorized as monitoring, diagnosis, or treatment tasks. Types of automated assistance were listed as alerting automation, which tracks important changes and alerts the user; decision-support automation, which may provide a diagnosis or recommendation; and implementation automation, in which the automated aid performs a specified task.
Automation-induced complacency
The concept of automation bias overlaps with automation-induced complacency, also known more simply as automation complacency. Like automation bias, it is a consequence of the misuse of automation and involves problems of attention. While automation bias involves a tendency to trust decision-support systems, automation complacency involves insufficient attention to, and monitoring of, automation output, usually because that output is viewed as reliable. Such complacency can be sharply reduced when automation reliability varies over time instead of remaining constant, but it is not reduced by experience and practice. Both expert and inexpert participants can exhibit automation bias as well as automation complacency, and neither problem can be easily overcome by training.

Take, for example, a pilot flying through inclement weather, in which continuous thunder interferes with the pilot's ability to understand information transmitted by an air traffic controller (ATC). However much effort the pilot allocates to understanding the ATC transmissions, performance is limited by the quality of the information source, so the pilot must rely on automated gauges in the cockpit for flight-path information. If the pilot perceives the automated gauges to be highly reliable, the effort devoted to monitoring them may decrease; the pilot may even ignore the gauges altogether in order to devote mental resources to deciphering the ATC transmissions. In so doing, the pilot becomes a complacent monitor and runs the risk of missing critical information conveyed by the gauges. If, however, the pilot perceives the gauges to be unreliable, the pilot must interpret information from ATC and the gauges simultaneously.
This creates scenarios in which the operator may expend unnecessary cognitive resources when the automation is in fact reliable, but is also more likely to identify errors in the gauges should they occur. To calibrate the operator's perception of reliability, automation should be designed to keep workload at appropriate levels while ensuring the operator remains engaged in monitoring tasks. Operators are less likely to disengage from monitoring when a system's reliability can change than when its reliability is constant (Parasuraman, 1993). To some degree, user complacency offsets the benefits of automation, and when an automated system's reliability falls below a certain level, automation is no longer a net asset. One 2007 study suggested that this occurs when reliability drops to approximately 70%. Other studies have found that automation with a reliability level below 70% can still be useful to persons with access to the raw information sources, which can be combined with the automation output to improve performance. Death by GPS, in which deaths are caused in part by following inaccurate GPS directions, is another example of automation complacency.
Sectors
Automation bias has been examined across many research fields.
Correction
Automation bias can be mitigated by redesigning automated systems to reduce display prominence, decrease information complexity, or frame assistance as supportive rather than directive information. Excessive checking and questioning of automated assistance can increase time pressure and task complexity, reducing the benefit of automation, so some automated decision support systems are designed to balance positive and negative effects rather than attempt to eliminate negative effects.