Making Autonomous Flight a Reality

Kathy Hunt

Almost from the first moment humankind took to the skies, we have searched for a way to make flight, and flight safety, less dependent upon human actions. The earliest successful attempt at automating flight occurred in 1912, when the New York-based Sperry Gyroscope Company developed the gyroscopic stabilizer apparatus, or Sperry Autopilot. Integrated into the aircraft’s hydraulic control system, it enabled a plane to maintain a straight and level course without constant pilot intervention. Lawrence Sperry demonstrated the device at the 1914 Paris Concours de la Sécurité en Aéroplane (Airplane Safety Competition) by performing a stable fly-by while he and his mechanic stood on the moving plane’s wings.

This astonishing display inspired further developments in automated flight, including the pairing of autopilot with navigation systems to enhance flight at night and in adverse weather. These advances, in turn, enabled the first trans-Atlantic flight flown completely on autopilot in 1947. Autopilot and other advancements would reduce pilot workload, fatigue, and error, and improve air safety.

The move toward automated and autonomous flight spurred new regulations and regulatory agencies. In Europe, the European Union Aviation Safety Agency handles civil aviation safety, while in the United States, the Federal Aviation Administration (FAA) enforces the Federal Aviation Regulations (FAR). Found in Title 14 Code of Federal Regulations (CFR), these rules cover everything from hot-air balloons to unmanned aircraft systems (UAS), and aircraft design to pilot training. They likewise outline general operating and flight rules (Part 91) for general aviation pilots. What these directives may not address is whether an automated or autonomous system, rather than a human, is in charge of a flight.


Flight Regulations and Autonomous Systems

ASTM International’s advisory committee AC377 works to ensure that aviation regulations are compatible with autonomous systems. Made up of members from ASTM’s committees on unmanned aircraft systems (F38), aircraft systems (F39), general aviation aircraft (F44), and aerospace personnel (F46), AC377 looks at autonomy in all aspects of aviation. Airplanes, air taxis or eVTOLs (electric vertical take-off and landing aircraft), and drones – as well as the design and operations of these vehicles – are all under its purview.

In 2020, the committee published its first technical report, TR1, which focuses on terminology and definitions and a common framework of certification requirements. It also parses out the differences between such terms as “automated” and “autonomous.” “Automated” indicates onboard human involvement in a flight: if a system should falter on an automated flight, a pilot will step in. In autonomous flight, the system operates on its own and can independently determine a new course of action in the absence of a predefined plan. It does not require human decision-making in the operation or management of a flight.

The second report, TR2, suggests and describes the fundamental principles, or technical pillars, of complex aviation systems development. It reviews established best practices for implementing automation in aviation and considers how they could be applied in this new age of autonomy. Both publications look at automation and autonomy from the perspective of design.

In March 2022, AC377 released its third technical report, TR3, which explores autonomy through the lens of operations. To create this document, committee members analyzed 3,171 lines of Title 14 CFR Part 91—General Operating and Flight Rules. Their goal was to determine whether a fully autonomous aircraft would be able to comply with the FAA’s operating and flight rules as they are currently written. Because Part 91 only applies to aircraft within the U.S. National Airspace System, ultralight vehicles, moored and unmanned free balloons, kites, amateur rockets, and small UAS are not included in AC377’s report.

Intended for use by technical experts in standards-developing organizations as well as regulators, industry stakeholders, and academia, TR3 details AC377’s process of analysis. It indicates which sections of Title 14 CFR Part 91 could present barriers to the widespread adoption of largely automated and autonomous aviation operations. It also offers possible ways of overcoming these deterrents. 

“One of the themes carrying through all AC377 technical reports is the idea of functional breakdown, of treating the relationship of automation and autonomy with an aircraft and its human operators on a function-by-function basis,” says Anna Dietrich, founder of the nonprofit Community Air Mobility Initiative, industry consultant, and a lead author of TR3-EB.

Much of the language in current regulations was written with a human pilot in mind.

Enabling Autonomous Aviation

Although TR3 focuses exclusively on Part 91, its suggested actions can be applied to other parts of Title 14 CFR. The reason that TR3 could aid additional aviation regulations has to do with language. Depending on how a regulation is worded, the language itself can serve as a barrier to autonomy.

“All of the existing regulations were written with the assumption that there is a person on board who is directly responsible for that airplane,” Dietrich explains. “Even for applications where we have significant use of autopilot systems and significant automation and autonomy in the aircraft, there is the assumption that we still have human fallback on the plane. We looked at Part 91 to see what function was being assigned to a human pilot and whether that function could be easily satisfied as written by a remote operator supervising that flight on the ground, by delegating it to autonomy on board the aircraft, or if neither solution was a viable path for how the rule was written.”

Dave Stevens, certification project manager at Joby Aviation, notes the use of human-centric language in Part 91 and how it limits the use of autonomous operations. 

“In some of the wording, there is the implication that communication is to be a real-time, spoken interaction with passengers as opposed to listening to a recording or reading a pamphlet. Radio communication presents another barrier. A pilot’s radio communication with a tower or other aircraft implies vocal, real-time speaking. If it didn’t say ‘radio communication’ and instead just said ‘communication,’ there could be a digital link that conveys far more than voice communications can. How it’s written now, this wouldn’t be allowed,” says Stevens, who is vice chair of F44 (general aviation aircraft), subcommittee chair of F44.50 (systems and equipment), and a lead writer of TR3.

According to the findings in TR3, 85 percent of the regulatory language in Part 91 presents no barriers. However, 15 percent of the wording does impede autonomy. These impediments fall into three categories: “slows process,” “needs small adjustment,” and “large barrier.” The first, slows process, holds that, although the regulatory language itself is not at issue, implementing its underlying assumptions could present challenges. Slows process accounts for one percent of the language impediments.

Ten percent of Part 91’s regulatory language falls under the heading of “needs small adjustment.” This category includes human-centric words such as “person” and “crewmember,” both of which preclude non-human actors, as well as language that needs to reflect technological advancements in design solutions. Regarding those advancements, the report suggests revisiting the information and equipment required onboard. In the case of remote piloting, the rules could be reworded to require the “availability of information” instead of the “physical presence” of equipment that a remote pilot would not use.

The Definition of Pilot

The remaining four percent of the stumbling blocks to autonomy have been labeled as “large barrier.” The majority deal with the regulatory definition of “pilot in command.” In Title 14 CFR Part 91, this term refers specifically to a person who carries out certain activities on an aircraft. Because pilot-in-command indicates onboard human involvement, the phrase excludes an automated or autonomous system from performing those tasks. 
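Taken together, the report's percentages imply rough line counts for each category. As a quick illustrative tally (the 3,171-line total and the category shares come from TR3 as reported above; the rounding of fractional lines is my own assumption, not the report's), a short script can show how the analyzed language breaks down:

```python
# Approximate per-category line counts implied by TR3's percentages.
# Total and shares are from the article; rounding is an assumption.

TOTAL_LINES = 3171  # lines of 14 CFR Part 91 analyzed in TR3

# Share of regulatory language per category, per the report's findings
category_share = {
    "no barrier": 0.85,
    "slows process": 0.01,
    "needs small adjustment": 0.10,
    "large barrier": 0.04,
}

# The four categories should account for all of the analyzed language
assert abs(sum(category_share.values()) - 1.0) < 1e-9

# Convert each share to an approximate line count
approx_lines = {name: round(TOTAL_LINES * share)
                for name, share in category_share.items()}

for name, count in approx_lines.items():
    print(f"{name:>24}: ~{count} lines")
```

By this rough arithmetic, only a few hundred of Part 91's analyzed lines stand in autonomy's way, and fewer than 130 of them constitute large barriers.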

“If you fix the definition and say ‘A pilot-in-command is a person or an appropriately certified automated system,’ the regulatory barrier goes away. If you expand what you mean by pilot-in-command, those words don’t have to change. The legal definition of the word is what really creates the conflict,” Stevens says.  

Before suggesting a change to this definition, the committee took other considerations into account.

“One of the big questions that we addressed was whether an automated system could take on this role of 14 CFR 91.3, which is a very pilot-centric and human-centric regulation,” says Stephen Cook, chair of AC377 and Northrop Grumman technical fellow in airworthiness. 

A section of Part 91, 91.3 pertains to the responsibility and authority of the pilot in command. It states the following: “The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft. In an in-flight emergency requiring immediate action, the pilot in command may deviate from any rule of this part to the extent required to meet that emergency. Each pilot in command who deviates from a rule under paragraph (b) of this section shall, upon the request of the Administrator, send a written report of that deviation to the Administrator.”

To aid in answering the question of whether an automated or autonomous system could take on the responsibilities of section 91.3, the advisory committee wrestled with the different roles that a pilot-in-command performs.


“We considered things such as his or her authority, liability, and accountability. Today, we have the ability to ask a pilot, ‘Why did you do that?’ Would we be able to do that with an automated system? Is it actually safer to have an automated system or a human performing the function? How mature is the automation tech? These are some of the questions and challenges we identified,” says Cook.

The Impact of AC377’s Technical Reports

Although TR3 does not propose regulations, it does provide information and guidance to aid in developing industry standards that address the means of compliance for autonomy in aviation. 

“Our hope is that with the three documents published thus far from AC377, we move as an industry toward writing technical requirements for autonomous systems and that standard writers take this functional breakdown approach. Then you could have an airplane certification package that acknowledges that different functions on that airplane are autonomous to different extents. It’s more of a structural approach to how we are recommending that standards be structured as opposed to ‘Go write a standard about X, Y, Z technical thing,’” Dietrich says. 

The technical report also exposes areas that should be discussed by industry and regulators. 

“We’d like for these technical reports to drive conversation, to drive dialogue with the goal of making aviation safer. Autonomy is still a relatively new type of idea and looking at it through the lens of the operational regulations, of the technical pillars, and of terminology and requirements framework, all of these things serve to move everyone together in the right direction,” Cook says.

Stevens adds, “We are saying to the regulators and lawmakers that, if you want to enable autonomous flight as the future of aviation, here are suggestions for what you can do to fix not only Part 91, but also many of the other parts as well. It’s targeted to making suggestions for how the regulators can remove some of these barriers in really simple ways that would keep the safety intent of the regulation in place but get the language out of the way to allow autonomous systems to introduce themselves.” ■

Kathy Hunt is a U.S. East Coast-based journalist and author.
