Introduction
Automation has been at the centre of technological development across the world. Various institutions have made rapid advances in automation, giving rise to robotics, autonomous vehicles, autonomous web applications, and frameworks and decision aids focused on the user experience. These advancements have changed job markets and affected people's daily lives in many ways. According to Johnson et al. (2004), automation will have replaced 30% of the workforce by 2030, with about 14% of workers shifting occupations. The current trend suggests this projection could be accurate: today, about 15% of the tasks performed across the world are fully automated. Automation involves data generation, transformation, and manipulation. With the help of technologies such as artificial intelligence and big data, automation can provide operational speed and efficiency, improving reliability and sustainability and increasing production. These abilities have made it possible to complete tasks in hazardous and difficult environments. Despite the capabilities of automation, humans still play a key role in its success.
Studies undertaken by Hoffman et al. (2013) have shown that automation does not replace human activities but instead alters operations in ways that place different demands on human operators. It has extended the capacity of humans to achieve tasks that once seemed impossible. As systems grow in complexity, however, the abilities of human operators diminish. Lee and See (2004) identify trust as one of the major factors influencing the reliance of human operators on automation. They point out that trust determines an individual's willingness to rely on automation in situations of uncertainty, which are common in automated processes. However robust these automated systems are, it is always likely that at some point they will fall short of human expectations. For this reason, it is vital that trust be appropriately calibrated to the actual performance of the given system (Feldhütter et al., 2016; Schaefer et al., 2016).
In many cases, too much trust in these systems has led operators to depend on automation for tasks outside its design parameters, and to task automated systems even when the systems are faulty. On the other hand, a lack of trust in automated systems has led to automation disuse, overloading operators with work while diminishing system performance (Lee & See, 2004). Facilitating appropriate levels of trust in automation is therefore important to maximize system performance as well as the safety of human operators (Hoff & Bashir, 2015; Stokes et al., 2010).
Trust plays an integral role between automation and human operators. Much of the existing literature defines trust as a state involving confidence about the motives of another concerning oneself in risky situations (Dzindolet et al., 2003). However, these scholars admit that there is no universally agreed definition of trust. The definitions that do exist stem from various domains and thus depend on the perspective from which trust is defined. Despite this diversity, there is a good deal of consistency in the key aspects from which trust is defined (Parasuraman, Sheridan & Wickens, 2008). For this reason, trust exists in various states under which given factors come into play.
Trust as an Observable Choice Behaviour
Muir and Moray (1996) observe that trust is often expressed as the willingness to engage in activities and behaviours relevant to the agent. It is also commonly expressed as an observable choice behaviour, in which people are free to make choices that lead to trust. The same is true of trust in automation. As a choice behaviour, trust can be conceptualized by human operators from both relational and rational perspectives. From a rational standpoint, trusting an automation system can stem from motivation as well as the desire to make rational and efficient choices. Such decisions to trust automation represent both the gains and the losses that follow from the trust developed. As an observable choice behaviour, however, trust in automation is always a product of choice that comes with the benefit of believing in the automation.
Trust as a Psychological State
Merritt and Ilgen (2008) suggest that trust is best conceptualized as a multidimensional psychological state involving beliefs and expectations relevant to the trustworthiness of the trustee. These beliefs and expectations are derived from interaction and experience with the trustee. Thus, in automation, the successful execution of tasks between the automated system and the human operator requires that both the cognitive and the affective features of trust be expressed. From a cognitive perspective, the process of developing trust between an automated system and a human operator involves receiving information and knowledge about the system's behaviour in specific situations. Over time, this information accumulates into large datasets that can be analyzed with artificial intelligence and elaborated into views that the human operator can understand, perceive, and act on consistently (Gold et al., 2015; Parasuraman & Miller, 2004). The cognitive process thus plays a vital role in the automation domain. It can help determine trust expectations, as in the study by Salem et al. (2015), which showed that the expectation of competence in a given machine was captured in the level of trust that operators had established with it. From the psychological perspective, the extent to which automation is expected to perform the task it was designed to do is therefore a major factor influencing its trustworthiness.
Limitations of the Research
The need to trust automation often arises from antecedents that influence trust in general. Trust has always been an issue in situations involving risk, vulnerability, and uncertainty, as well as in those that involve dependence. In the interaction with automation, these antecedents are no less prominent. Automation is vital in making complex tasks easier, enabling individuals to achieve more in a short time. However, automation leads to over-dependence by human operators, who come to expect automated systems to function whenever needed. In the event of failure, operators risk not completing the required task, leaving them vulnerable to negative outcomes, since the cost of failing to perform a required duty is always significant.
Similarly, according to Neigel et al. (2018), working with automation in today's world requires an implicit acceptance of the vulnerabilities associated with it, leaving operators open to uncertain outcomes should the automation fail to perform the required tasks. For this reason, some automation theorists hold that trust in automation is not much of an issue. To these theorists, trust in automation is a misnomer that amounts to nothing more than the reliable use of what works well and the dismissal of what does not suit the operator's requirements. Operators mostly base their allocation behaviour on the properties of the automation, which revolve around the working and usage of the system. If the automated system is faulty, operators shun its use. Trust is thus relative to the reliability of the automation: when the automation is reliable, trust is high, and the automation is most likely to be used. It has also been argued, however, that trust in automation is more than just fault detection. The view that trust, as well as the corresponding trust behaviour, is based on more than the properties of the agent is supported by both empirical and theoretical evidence. According to existing research, trust is not affected by automation faults when the faults are known in advance, and so is not determined solely by the properties of the automation (Muir, 1994; Pop, Shrewsbury & Durso, 2015).
Future of the Research
The importance of trust in any given relationship cannot be disputed. Trust is an intangible construct with unique properties. However, most of the existing literature on trust in automation fails to articulate the differences between trust in automation and trust in humans. One may argue that only the referent of trust, whether human or automation, varies. However, the modes through which trust is constructed must be considered, as they are likely to differ depending on the domain in question. Researchers such as Rice (2009), Ranjan et al. (2016), and Shankar (2008) have argued that there are similarities between trust in automation and trust in humans. These researchers asked people to rate specific attributes with regard to their understanding of trust in humans and in automation.
A portion of the data collected showed that convergent attributes exist on which a general understanding of trust in humans and in automation can be based. However, the research also highlighted differences between trust in automated systems and trust in humans. Humans have unique traits and are largely governed by emotions and experiences, so it is not easy to equate trust in automation with trust in humans. Moreover, the relationship between people and automated systems does not conform to the conventional definition of a relationship, which ought to be reciprocal. Between automation and humans there is no reciprocation; the relationship is one-sided, with no mutual contribution to its benefits. Working together to build trust, as two individuals do in real life, is also not applicable to the relationship between automation and humans. The systems are programmed to perform consistently over a given period of time, not to build relationships and trust with humans.
References
Dzindolet, M. T., Peterson, S. A., Pomranky, R. A., Pierce, L. G., & Beck, H. P. (2003). The role of trust in automation reliance. International Journal of Human-Computer Studies, 58(6), 697-718.
Feldhütter, A., Gold, C., Hüger, A., & Bengler, K. (2016). Trust in automation as a matter of media influence and experience of automated vehicles. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 2024-2028). Los Angeles, CA: SAGE Publications.
Gold, C., Körber, M., Hohenberger, C., Lechner, D., & Bengler, K. (2015). Trust in automation - Before and after the experience of take-over scenarios in a highly automated vehicle. Procedia Manufacturing, 3, 3025-3032.
Hoff, K. A., & Bashir, M. (2015). Trust in automation: Integrating empirical evidence on factors that influence trust. Human Factors, 57(3), 407-434.
Hoffman, R. R., Johnson, M., Bradshaw, J. M., & Underbrink, A. (2013). Trust in automation. IEEE Intelligent Systems, 28(1), 84-88.
Johnson, J. D., Sanchez, J., Fisk, A. D., & Rogers, W. A. (2004). Type of automation failur...
Essay Sample on Factors Affecting Trust in Automation. (2022, Oct 19). Retrieved from https://proessays.net/essays/paper-example-on-factors-affecting-trust-in-automation