
From weeks to minutes: How AI is accelerating the flight test process

  • By Tech. Sgt. Robert Cloys
  • Air Force Test Center

Before a single aircraft lifts off the runway at Edwards Air Force Base, engineers must complete a mountain of documentation.

Test plans, hazard analyses, evaluation frameworks, and technical reports are all part of the process that ensures flight tests are conducted safely and produce meaningful results. Drafting those documents can take hours, days, and sometimes weeks.

A new artificial intelligence tool is beginning to compress that timeline dramatically.

At the Air Force Test Center, engineers are using the AI Flight Test Assistant, known as AFTA, to rapidly generate first drafts of the documents that support the flight test process. The tool allows engineers to spend more time on analysis, planning, and the execution of complex tests, and less on drafting, wordsmithing, and compiling information.

AFTA began as a tool designed to generate test documentation, but the platform has since evolved into a broader workflow system.

“The AI Flight Test Assistant is a cloud-based tool that uses generative AI to augment labor-intensive test and evaluation processes,” said Jordan Conner, an AFTC AI Implementation lead and the “father” of AFTA. “Initially it was just a document generator, but now it functions as a no-code workflow editor where users can build their own custom AI-automated processes.”

Mission owners can tailor these AI assistants for their organization’s custom needs and upload document repositories, informing how the system drafts new material. A role-based access control system provides additional oversight, allowing teams to manage how the tool is used across different projects within their organization.

The goal is straightforward: reduce the administrative burden that often slows the early stages of test planning.

“Engineers still must review everything and bring the final product to completion, but AFTA can get them to a solid first draft very quickly,” Conner said.

In one example, an operational tester at the Air Force Operational Test and Evaluation Center built a custom workflow that automated the generation of operational test measures. The task previously required more than 20 hours of manual work. Using AFTA, the system generated the first draft in less than two hours, with less than five minutes of initial human input.

The process runs in the background, allowing engineers to continue other work while the first draft document is produced.

Another user from the 96th Test Wing created a Rough Order of Magnitude generator in less than ten minutes using AFTA’s workflow editor. The tool can now produce the first draft of a ROM document in under a minute. Previously, generating that document required several people working for hours.

Efficiencies like these can substantially affect how quickly testing moves forward. Before conducting any flight test, engineers must produce documentation that defines how the test will be conducted, how data will be collected, and how risk will be managed. By accelerating the creation of those documents, tools like AFTA can help reduce the time required to prepare for testing.

In an era where military advantage increasingly depends on speed of decision, adaptation, and capability development, reducing time-to-test helps ensure new technologies reach operational forces faster.

Maj. Gen. Scott Cain, commander of the Air Force Test Center, said accelerating the pace of testing is essential to maintaining an operational advantage.

“Speed matters,” Cain said. “Our ability to test, learn, and adapt faster than potential adversaries allows us to deliver credible capability to the warfighter. Tools that help our engineers move faster while maintaining rigorous testing standards are critical to that effort.”

Since its introduction, AFTA has expanded rapidly across the Department of the Air Force. More than 800 users across the service are now experimenting with the platform, with more than 30 organizations building custom workflows to support their own processes.

Earlier this year, AFTA was demonstrated during the Air Force Operational Test and Evaluation Center's AI Technology Showcase. Government attendees were asked which application presented during the event would be most useful for their organizations.

AFTA ranked first.

Despite the growing interest in artificial intelligence, developers emphasize that tools like AFTA are designed to assist engineers rather than replace them.

“AI will get you to a strong first draft,” Conner said. “But humans are always in the loop. Engineers must still review, edit, and validate everything before it moves forward, just as they would without the help of AI.”

Christopher Hereford, who supports AI implementation efforts across AFTC, said tools like AFTA should be viewed the same way as any other piece of software.

“Any AI application is just a tool,” Hereford said. “Even MS Word still does things I don’t understand. We can expect the same from language models and applications built on them. It is not a panacea.”

The system also differs from conversational AI tools such as GenAI.mil. While those platforms function as general purpose chat assistants, AFTA focuses on structured and repeatable workflow automation. Users provide the same reference documents they would normally use to write test documentation, and AFTA follows predefined processes and policies to generate consistent outputs that engineers review and refine.

That repeatability is important in a testing environment where documentation must remain consistent, traceable, and defensible.

As artificial intelligence continues to mature, leaders across the Department of War are increasingly focused on how it can accelerate decision making and capability development. Within the test enterprise, tools like AFTA represent one example of how AI can support the work of engineers responsible for preparing the next generation of Air Force systems for operational use.

By reducing the time required to complete routine documentation tasks, engineers gain more time to focus on analysis, innovation, and preparing complex tests.

And in a testing environment where every day saved can accelerate the delivery of new technology to the warfighter, even small gains in efficiency can make a difference.