The Challenges of Automation

The following was submitted in my Florida Institute of Technology Human Factors course (AHF5101). Feel free to provide your thoughts and input. Automation in Part 121 flight decks has certainly been a benefit, but recent accidents highlight the degradation of pilot skills due to automation dependency and complacency.

The current infrastructure of air transportation throughout the world is likely something the Wright Brothers could not have imagined. Perhaps they had a basic idea of how manned flight would alter the modes of transportation for generations to come, but could they have comprehended that jet aircraft, largely controlled through automated systems requiring minimal interaction from operators, would transport hundreds of people great distances, over oceans, all in a matter of 12 to 14 hours? The ability to cross borders, continents, oceans, and desolate surface environments by air with highly sophisticated jet aircraft has helped improve the economies of both the developed and developing world. The jet age has helped expand trade liberalization and improve the lives of many people, but beyond the expansion of jet aircraft over the past four decades, improvements in aircraft systems and automation have strengthened the integrity of the air transport segment.

A discussion of automation cannot occur without recapping the significant technological improvements that autopilot and flight control systems have seen over the years. Improvements in ground-based navigation aids, widespread deployment of GPS and area-navigation (RNAV) long-range navigation tools, increasingly advanced avionics packages, and faster on-board computing power have all helped improve the accuracy and integrity of aircraft automation systems. A century ago, the idea of navigating without any reference to ground-based landmarks that could be visually sighted by the pilot was likely not even considered. As aviation has evolved, missions have changed: lives are transported, precious cargo is moved, and the requirement for high-fidelity systems that help minimize risks to flight safety has increased. However, two recent accidents have underscored the role automation may have played in the eventual degradation of traditional manual flying skills, whether through reliance on automation or a systematic reduction in the maintenance of manual flying skills among professional pilots.

The first autopilot, developed by Lawrence Sperry, was a basic device that allowed for automated control of an aircraft during straight-and-level flight (Scheck, 2004). The system connected a gyroscopic heading indicator to hydraulically controlled elevators and rudder. This first autopilot, developed in 1912 and showcased in Paris in 1914 (Scheck, 2004), began the technological development wave that would launch the development of higher-fidelity autopilot systems. In 1930 the Royal Aircraft Establishment in England successfully developed a “pilot assister” device that utilized a pneumatically powered gyroscope to maintain longitudinal and lateral stability (Popular Mechanics, 1930, p. 950). These basic systems allowed for automated control of an aircraft using an initial input by the pilot to activate and instruct the system to remain in straight-and-level flight. While these systems were welcomed within the aviation community at the time, they are merely the foundation of the exceptionally more advanced autopilot systems in use today.

As jet aircraft became the norm, advancements in avionics were required to aid pilots during high-altitude navigation and in maintaining aircraft control in the thinner atmosphere at altitude. Comprehensive flight management systems and improved radio technology allowed for greater signal reception and tracking capabilities from ground-based navigation aids. Jet aircraft also operated at speeds much faster than the piston/propeller aircraft from which many pilots of the era were transitioning. Autopilot enhancements during the “jet age” of American aviation included improved lateral and vertical modes. For example, while basic autopilot systems allowed only for the maintenance of straight-and-level flight, the advanced systems allowed for control of an aircraft’s climb rate, airspeed, heading, and navigation tracking, decreasing overall pilot workload in the cockpit. However, these advances have also come at a cost.

Thanks to an overall improved understanding of aerodynamics, engineers are able to design complex autopilot systems that provide transport aircraft with high levels of stability in a wide range of flight conditions (Vaillard et al., 1986). However, alongside benefits such as improved navigation accuracy, reduced crew fatigue, and smoother flight characteristics, there are real negative consequences of significant automation usage. McKinney (2004) remarks on how MD-80 crews adjusting the vertical speed setting within 1,000 ft of the altitude selected in the autopilot could wipe out the autopilot’s successful capture of that altitude, due to the late change in input from the pilot. This is certainly an error in the system’s use, but more importantly McKinney highlights how many MD-80 altitude busts, or vertical deviations, occurred due to improper system management. McKinney further notes that such an adjustment is unnecessary, given that the MD-80’s autopilot logic allowed for a smooth level-off from vertical speeds of even 4,000 feet per minute. This example reinforces prior research from Boehm-Davis (2000), which acknowledged directly that automation was introduced in part to reduce error in the aviation system but that it has also introduced new errors that must be managed. More importantly, Boehm-Davis remarked that automation has distinctly changed the roles, responsibilities, and activities of pilots, from psychomotor flying skills to monitoring and delegating tasks to the automation.

A significant number of aviation accidents have contributing factors that point directly to mismanagement of the automatic flight control system, a flight crew’s attention being directed toward figuring out what the system is actually doing rather than flying the airplane manually, or a reluctance to reduce the level of automation in use at a given time. Consider UPS Flight 1354, which crashed on approach to Birmingham-Shuttlesworth International Airport in August 2013; the NTSB found that the crew’s mismanagement of the on-board flight management system and associated automatic flight control system was a contributing factor in the crash (National Transportation Safety Board, 2014). To a certain extent pilots have become dependent on the automation to perform certain tasks, but a level of complacency has also been established. It is common knowledge within the piloting profession that there has been increasing dependence on automated processes to accomplish the mission. Prinzel & Pope (2000) highlight that complacency with automation systems in the cockpit has grown as these very systems have become more commonplace throughout the industry, and they tie it to self-efficacy, or one’s expectation of one’s ability to accomplish certain tasks. This growing dependence, after years of fairly solid performance and a belief that the system will continue to operate as previously observed, has led to a level of trust in advanced autopilot systems and, in turn, a relaxed operational environment. Complacent behavior is most likely to be present when complacency potential exists due to a higher-than-average workload, for instance during periods of poor weather, heavy traffic, or fatigue due to poor sleep or lengthy flight segments (Parasuraman et al., 1993).
Remarkably, Prinzel & Pope found that individuals who rated high in self-efficacy performed poorly under conditions of high workload when managing automated processes. This correlates with the growing level of complacency with automation systems, as these very systems provide a mental, and even physical, crutch to the operating pilot or crew.

Dependence on automation is of additional concern for aviation safety professionals. Due to organizational standards, many aviation operators are encouraged to utilize automation during even the most basic flight maneuvers in periods of low workload. A flight’s takeoff is indeed a crucial stage of flight; however, the after-rotation climb-out is not necessarily as large a workload contributor, and yet many aviation firms highly recommend that flight crew members utilize the autopilot to perform a constant-speed climb-out and lateral navigation. This institutionalized dependence encourages flight crews to become less of a pilot and more of a system manager. Ebbatson et al. (2010) recognized that modern jet transport aircraft are typically flown using various levels of automation, but most importantly showed that manual flying skills decay over time if pilots fail to take advantage of opportunities to practice those very skills during day-to-day operations. Organizations must find a way to safely encourage crewmembers to practice manual flying skills in low-workload environments, specifically during visual meteorological conditions. Additionally, organizations should assist flight crews in performing threat analysis for certain stages of flight in which, even during visual meteorological conditions, crews should revert to automation in an effort to minimize workload.

Few in the aviation community could argue that automation has not had a significant role in improving the overall integrity of the transport of passengers and freight. Autopilots have evolved along with the aircraft that carry them, and the reduced workload they provide is highly regarded as a valuable economic benefit for a number of reasons. Yet deskilling, the elimination of skilled labor within an economy due to technological advances such as automation, has impacted the professional piloting profession. Due to improved automated systems, the average crew complement in air transport aircraft has been reduced from three to two.

These technological advances must not restrict pilots from being pilots. Parasuraman & Riley (1997) highlight a set of automation strategies to help operators. First, operators must have better knowledge of how the automation works. Second, organizations should provide a clear set of policies and procedures offering guidance on when and when not to use automation. Third, and perhaps most important to the issue of pilots’ use of automation, operators should be taught to make rational automation-use decisions. Fourth, the automation should not be too difficult to turn on or off. On the third element, Parasuraman & Riley strike exactly at the challenges many flight crews have faced due to poor automation use, or more appropriately, inappropriate use of automation when a lower level of automation (manual flying) would have been more suitable.

A dangerous environment can develop in the cockpit when a crew is overloaded with a certain task and fails to properly manage the automation. Ultimately, it must be acceptable for pilots to disengage automation when it is not accomplishing what they believe it should, rather than submit to the system’s potentially flawed execution. Human factors and ergonomics engineers will no doubt continue to find ways to improve the automated controls that exist in aircraft, but pilots must step up and remain committed to the very basic skills learned during initial pilot training. It is those very basic, and extremely analog, skills that will likely recover an aircraft from an undesired aircraft state after poor automation management. Aircraft manufacturers must take the steps necessary to ensure they do not develop systems that fully take pilots out of the loop (Ropelewski, 1996). Ropelewski also highlights how new generations of airline cockpits pose significant challenges for human factors experts, flight training specialists, and pilots, as the evolution of automation seems to outpace the training programs at customer airlines. Advances in avionics and automation also have a significant impact on how crews react to unexpected occurrences within the system. Because of the dependence and complacency that have set in, improved feedback design must also be incorporated in the evolution of automation systems in air transport applications (Sarter & Woods, 1997). Design advice from Rasmussen (1999) stresses that human error trapping must begin with identifying behavior-shaping system constraints within an acceptable boundary of operation. System designers should not only focus on potential errors arising from poor management of the automation but should also dedicate time and resources to studying potential interactions with the systems by real human subjects.
By improving not only the systems themselves but also the ways various actors interact with them, and by supporting the notion that pilots are, and should remain, pilots rather than dedicated system managers, the aviation industry may see a reduction in automation-related accidents.


Boehm-Davis, D. A. (2000). Cognitive modeling of airline crew automation errors. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 44(13), 95.

Ebbatson, M., Harris, D., Huddlestone, J., & Sears, R. (2010). The relationship between manual handling performance and recent flying experience in air transport pilots. Ergonomics, 53(2), 268-277.

McKinney, D. (2004). Automation’s unintended consequences; there’s enhanced safety in aviation’s pervasive digitalization, but there are unwelcome surprises as well. Business & Commercial Aviation, 94(5), 56-59.

National Transportation Safety Board. (2014). NTSB finds mismanagement of approach to airport and failure to go-around led to crash of UPS Flight 1354. NTSB Press Release, 9 September 2014.

Parasuraman, R., Molloy, R., & Singh, I. (1993). Performance consequences of automation-induced “complacency”. The International Journal of Aviation Psychology, 3(1), 1-23.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230-253.

Popular Mechanics. (1930). Robot air pilot keeps plane on true course. Popular Mechanics, p. 950.

Prinzel, L., & Pope, A. (2000). The double-edged sword of self-efficacy: Implications for automation-induced complacency. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 3, 107.

Rasmussen, J. (1999). Ecological interface design for reliable human-machine systems. International Journal of Aviation Psychology, 9(3), 203.

Ropelewski, R. (1996). Control in the cockpit: Crews vs. computers. Aerospace America, 34(8), 28.

Sarter, N. B., & Woods, D. D. (1997). Team play with a powerful and independent agent: Operational experiences and automation surprises on the Airbus A-320. Human Factors, 39(4), 553-569.

Scheck, W. (2004). Lawrence Sperry: Genius on autopilot. Aviation History, 15(2), 46-52.

Vaillard, A., Paduano, J., & Downing, D. (1986). Sensitivity analysis of automatic flight control systems using singular-value concepts. Journal of Guidance, Control, and Dynamics, 9(6), 621-626.
