Autonomous Flight
From hobbyist drones to military and commercial unmanned aerial vehicles, the world has become captivated by the possibility of autonomous flight. However, before we can jet off in a pilotless airplane, standards for increasingly autonomous aviation systems must be developed. This is where the task group on autonomy design and operations in aviation comes into play.
The group brings together leaders from organizations such as NASA, Transport Canada, and the U.S. Federal Aviation Administration, as well as from ASTM International committees on light sport aircraft (F37), unmanned aircraft systems (F38), aircraft systems (F39), and general aviation aircraft (F44). Together, its members are delving into the potential of autonomous operation to improve safety for manned and unmanned aircraft. Although the group won't write the standards for safe autonomy itself, it will produce two technical reports laying out short- and long-term strategies for creating such standards.
“We want the standards that are produced to be consistent in regard to automation and autonomy. In particular, we want the standards to use consistent terminology and have harmonized guidance for the level of rigor of the standard and the means of compliance,” says Stephen Cook, Ph.D., Northrop Grumman technical fellow and chairman of the task group.
Speaking the Same Language
The group’s first task was to establish a common language for discussing autonomous aviation. “When I say ‘machine learning,’ your vision of machine learning may be different. We need reasonable consensus and utility of terms so that our work can move forward,” says Andrew Lacher, chairman of the subgroup on terminology and levels and senior principal systems engineer at MITRE.
To ensure this common lexicon, Lacher's subgroup has drafted a terminology list that defines and differentiates between terms such as machine learning and artificial intelligence, and automated and autonomous flight. For example, automated flight still involves a human: should a system falter, a pilot will step in. Autonomous flight, on the other hand, has no human involvement; the system operates itself. The glossary aims to reduce confusion and aid the conversation about autonomous systems in aviation.
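One way to picture the distinction the glossary is drawing, purely as an illustration and not as the subgroup's actual definitions, is as a small set of control modes in which only manual and automated flight retain a human fallback. The mode names and helper function below are assumptions for this sketch:

```python
from enum import Enum, auto

class ControlMode(Enum):
    """Hypothetical illustration of the glossary's distinction (not ASTM wording)."""
    MANUAL = auto()      # human pilot flies the aircraft directly
    AUTOMATED = auto()   # system performs tasks, but a pilot steps in if it falters
    AUTONOMOUS = auto()  # system operates itself with no human involvement

def requires_human_fallback(mode: ControlMode) -> bool:
    # Under the article's definitions, only manual and automated flight assume
    # a pilot is available to take over; autonomous flight does not.
    return mode in (ControlMode.MANUAL, ControlMode.AUTOMATED)

if __name__ == "__main__":
    for mode in ControlMode:
        print(mode.name, "-> human fallback expected:", requires_human_fallback(mode))
```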
How one defines safe autonomy is also important. Within the aviation community, “safe” means zero fatalities. “A key concept is whether the [autonomous] system can meet or exceed the performance of a human pilot. There are tasks for which autonomy can provide better performance, and, therefore, better safety. By deconstructing the skills and/or tasks of flying and looking at operational as well as technological risk, we believe we can get to higher levels of safety than we have today,” says Anna Dietrich, chair of the certification framework subgroup and co-founder of Terrafugia Inc.
The U.S. Federal Aviation Administration aims to attain aviation safety objectives while imposing the least possible burden on society. If properly implemented, the automation of challenging or tedious pilot tasks can lead to an increase in aviation safety without an increased burden on the populace.
Creating a Unique Schema
Along with establishing a shared vocabulary, the task group is looking at the levels of automation within the automotive industry. By studying the efforts and outcomes of vehicular autonomy, the task group can better understand the safety benefits and risks associated with high levels of automation.
“Aviation is working to avoid situations in which the autonomy fails and just dumps control back on the human. This has been shown to be unsafe both practically and from a human factors workload perspective. This drives us to more vertically integrated autonomy in particular functional areas than has been seen in the automotive industry,” says Dietrich.
Cook adds, “Our perspective so far is that ‘levels of autonomy’ as proposed in the auto industry are overly simplified and don’t translate exactly to the role of the pilot, the air traffic controller, and, in the case of unmanned aircraft, the remote pilot. Additionally, the certification paradigm in aviation is much different than that of the auto industry.”
Concluding that aviation requires a different certification framework, the group has proposed developing its own schema for autonomy. Its framework would highlight the operating environment, system characteristics, intended function, level of human involvement, and accountability for actions. It likewise would detail safety benefits and risks. These features would provide better guidance for the operational oversight and civil aviation authority certification of autonomous systems.
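A rough way to see how such a schema differs from a single automotive-style "level of autonomy" is to model each dimension the group lists as its own field. The field names and example values below are assumptions for illustration only, not the task group's actual schema:

```python
from dataclasses import dataclass

@dataclass
class AutonomyProfile:
    """Illustrative sketch; field names are assumed, not the task group's schema."""
    operating_environment: str   # e.g., "uncontrolled airspace, rural, daytime"
    system_characteristics: str  # e.g., "fixed-wing UAS, redundant navigation"
    intended_function: str       # e.g., "beyond-visual-line-of-sight survey"
    human_involvement: str       # e.g., "remote pilot monitors, can intervene"
    accountable_party: str       # e.g., "certificate-holding operator"

# Unlike a single numeric "level", a certification reviewer would weigh all of
# these dimensions together when judging safety benefits and risks.
example = AutonomyProfile(
    operating_environment="uncontrolled airspace, rural, daytime",
    system_characteristics="fixed-wing UAS with redundant navigation",
    intended_function="pipeline inspection beyond visual line of sight",
    human_involvement="remote pilot monitors and can intervene",
    accountable_party="certificate-holding operator",
)
print(example)
```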
“The FAA is excited about the potential safety benefits, efficiency improvement, and increased utility that automation technology will bring to aviation, particularly in cases where human-machine teaming can improve overall system performance,” says Wes Ryan, unmanned systems certification lead with the FAA.
Ryan notes that most new entrants have never designed, built, tested, or fielded operational systems with regard to aviation reliability, integrity, and safety requirements. “The key is for them to take a methodical approach to automation, showing it can improve safety and accomplish the intended mission, not automating things for automation’s sake,” Ryan says. “We have a need for automation standards so new entrants into the safety-critical aviation industry understand the requirements and expectations for reliability, availability, robustness, and safe operational use of automated functions. We also have a need for standardization of design best practices that have been proven to work in operational real-world conditions so new entrants can benefit from past mistakes. It is costly and time consuming for each company to try their own path, only to find that someone already tried their idea and was unable to implement it safely in commercial or civil service.”
As with any complex undertaking, challenges arise when dealing with increasingly autonomous aviation systems. Concerns about cyber-physical security, wireless communication and data acquisition, and human-machine integration play a major part in the discussion. So, too, do questions about the diversity of aircraft involved, airspace access for unmanned aircraft, and public fears about privacy and safety.
Phil Kenul, F38 chair, retired rear admiral, and senior vice president of aviation and operations at TriVector Services, details other considerations. These include the design requirements and reliability of unmanned aircraft systems, as well as the cost of bringing them to the same level of safety as manned aircraft. He points out that unmanned systems must be intelligent and effective enough to detect other airborne objects and ensure their safety.
“For the use case, is it safe enough? Small aircraft will be lower risk if you fly them properly. However, once you fly them in more complex operations, you must have a higher standard. We need consensus on how much more engineering is needed to ensure the capability and safety of unmanned aircraft systems,” Kenul says.
The task group is also focused on airworthiness certification. Of specific interest are the complexity and number of test cases needed to prove that an autonomous aircraft will operate safely under foreseeable conditions. Committee F38 has already put in place a practice for methods to safely bound flight behavior of unmanned aircraft systems containing complex functions (F3269). According to Cook, this standard offers "a fresh approach towards addressing this challenge through a run-time assurance framework."
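The run-time assurance idea behind that approach can be sketched, in highly simplified form, as a verified monitor that lets a complex (possibly machine-learned) function command the aircraft only while its outputs stay inside pre-approved bounds, and otherwise hands control to a simpler recovery function. The limits, function names, and values below are illustrative assumptions, not the standard's requirements:

```python
# Minimal run-time assurance (RTA) sketch: a trusted monitor bounds the output
# of an untrusted "complex function" and falls back to a simple recovery
# controller when a command leaves the pre-approved envelope.
# All limits and names here are assumptions for illustration, not F3269 text.

MAX_BANK_DEG = 30.0    # assumed safe-envelope limit for this example
MAX_PITCH_DEG = 15.0

def complex_function(state: dict) -> dict:
    """Stand-in for an advanced or machine-learned flight function."""
    return {"bank_deg": state.get("requested_bank", 0.0),
            "pitch_deg": state.get("requested_pitch", 0.0)}

def recovery_function(state: dict) -> dict:
    """Simple, independently assured behavior: level the wings and hold pitch."""
    return {"bank_deg": 0.0, "pitch_deg": 0.0}

def run_time_assurance(state: dict) -> dict:
    """Safety monitor: pass through the complex command only if it is in bounds."""
    cmd = complex_function(state)
    in_bounds = (abs(cmd["bank_deg"]) <= MAX_BANK_DEG and
                 abs(cmd["pitch_deg"]) <= MAX_PITCH_DEG)
    return cmd if in_bounds else recovery_function(state)

if __name__ == "__main__":
    print(run_time_assurance({"requested_bank": 20.0, "requested_pitch": 5.0}))  # passed through
    print(run_time_assurance({"requested_bank": 60.0, "requested_pitch": 5.0}))  # recovery engaged
```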
Ultimately, the task group's efforts will steer the creation of ASTM International standards for increasingly autonomous aviation systems and, in doing so, shape the future of aviation.
Kathy Hunt is a U.S. East Coast-based journalist and author.