Avionics software testing ensures that avionics technology delivers its intended functions and that it does so safely and securely. What do industry experts consider best practices in the domain of avionics software testing? What are the standards of reference and the qualifications of those conducting the tests? And what are the current and coming testing developments? Here’s what the experts had to say.
Early and Often
One key question about avionics software testing is how often it should be conducted. There are several aspects to consider, according to Ricardo Camacho, director of safety and security compliance at Parasoft. “It is important that testing be performed as early and often as possible. For example, during requirements decomposition and architectural design, many organizations have adopted modeling because the complexity is so great that abstraction from text to pictures is required,” he said. “SysML or UML is the modeling language of choice. It gives the system engineer the ability to build a logical architecture, test it through simulation, further refine the design, and follow it with a physical architecture that can also be tested before handing it over to software development.”
While early testing of avionics software applications is usually performed in a simulated environment, i.e. “on-host testing,” DO-178C guidance requires the testing of software applications on the final hardware on which they will be hosted, said Nick Bowles, head of marketing at Rapita Systems. “This type of testing is known as ‘on-target’ testing and usually happens further along in the software development lifecycle (SDLC). On-target testing provides vital evidence that the software will perform as expected when hosted on the real avionics platform it is designed for.”
Across the industry, there are multiple avionics software testing techniques in use, Bowles noted. “While informal testing can take place, formal tests that count towards the certification of avionics software (under guidelines such as DO-178C) should map directly to specific software requirements that are defined before software development begins,” he said. “To achieve higher-level testing, more of the final production software needs to be integrated, so lower-level testing is possible earlier in the SDLC.”
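In practice, that mapping is usually captured as traceability from each test case back to the requirement it verifies. The following minimal sketch shows the idea in C; the requirement ID (SWR-123), the limits, and the function under test are invented for illustration, not drawn from any real program.

```c
/* Hypothetical example: a low-level test that traces directly to a
 * software requirement, as DO-178C expects. The requirement ID and
 * the function under test are illustrative only. */
#include <assert.h>

/* Function under test: clamps a sensed airspeed (knots) into a valid range. */
static int clamp_airspeed(int knots)
{
    if (knots < 0)   return 0;    /* below range */
    if (knots > 450) return 450;  /* above range */
    return knots;
}

/* Traces to hypothetical requirement SWR-123:
 * "Airspeed values shall be limited to the range 0..450 knots." */
static void test_SWR_123_airspeed_is_clamped(void)
{
    assert(clamp_airspeed(-10) == 0);   /* below lower bound */
    assert(clamp_airspeed(0)   == 0);   /* lower boundary */
    assert(clamp_airspeed(300) == 300); /* nominal value */
    assert(clamp_airspeed(450) == 450); /* upper boundary */
    assert(clamp_airspeed(500) == 450); /* above upper bound */
}

int main(void)
{
    test_SWR_123_airspeed_is_clamped();
    return 0;
}
```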
To ensure that a sound system is built, it’s necessary to use model execution or simulation to test the architecture and the interfaces between the system’s parts, according to Camacho. “Testing will be performed again and again as the system evolves to the point that it is handed over to the software team for implementation,” he said. “In addition, system engineers define test cases specifying what should be tested and how, which the quality assurance team then implements and executes.”
The “early and often” principle of avionics software testing changes when the software team determines that the codebase is solid and deliverable and hands it over to the quality assurance team, Camacho said. “As an independent third party, the QA team exercises the code and system to flush out any unidentified bugs or functional flaws. Compliance with standards like DO-178C and others is also often required. A lot of work goes into this, and it can take a QA team many months to achieve.”
Standard of Reference
Indeed, DO-178C (ED-12C in Europe) is the primary document that provides guidance for developing airborne software systems. “DO-178 was developed in the 1970s and defined a prescriptive set of design assurance processes for use in airborne software development focused on testing and documentation,” Bowles said. “In the 1980s, DO-178A was released, which introduced the concept of different software criticality levels and prescribed different activities for different levels. Released in 2012, DO-178C clarified details and removed inconsistencies from DO-178B. DO-178C also includes supplements that provide specific guidance for design assurance when specific technologies are used.”
The DO-178C standard for airborne software places a strong emphasis on verification in general and on testing in particular, according to Benjamin Brosgol, senior software engineer at AdaCore and vice chair of The Open Group FACE Consortium’s Technical Working Group. “The standard’s approach to testing has an important distinguishing characteristic: In contrast to so-called ‘white-box’ testing, in which test cases are derived from the source code’s control structure, DO-178C specifies that testing is always based directly on the software’s requirements,” he said. “Additionally, the major change from DO-178B was not so much in the ‘core’ document but rather in the formulation of specialized supplements on model-based development and verification, object-oriented technology and related techniques, and formal methods. Using any of these technologies affects the nature and extent of the requirements for testing.”
Regarding where avionics software test automation is headed, Camacho believes that the incorporation of artificial intelligence and machine learning will bring about transformations that were not previously considered. “Just as new requirements to address security concerns have been developed, I believe that autonomous avionic systems, particularly in civil aviation, are where we will see new standards arise,” he said.
According to Brosgol, a current industry trend is the increasing emphasis on cybersecurity, and this trend can affect testing in several ways. “One is the growing usage of ‘fuzzing’ as a technique for detecting vulnerabilities. Another is the adoption of sophisticated static analysis techniques, including formal methods, to supplement testing and to prove security-based program properties such as correct information flows,” he said.
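In its simplest form, fuzzing bombards an interface with pseudo-random inputs and watches for crashes or violated checks. The sketch below illustrates the idea in C with a hypothetical message parser; real campaigns would typically use a coverage-guided fuzzer such as AFL or libFuzzer rather than plain random generation.

```c
/* Minimal fuzzing sketch: feed a parser pseudo-random inputs and rely
 * on runtime checks to flag failures. The parser is hypothetical. */
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical parser for a 4-byte message field. */
static int parse_label_field(const uint8_t *buf, size_t len)
{
    if (len < 4) return -1;            /* reject short messages */
    if (buf[0] > 0x7F) return -1;      /* reject invalid label */
    return buf[1] | (buf[2] << 8);     /* decode payload */
}

int main(void)
{
    uint8_t buf[16];
    srand(12345);                      /* fixed seed: reproducible runs */
    for (long i = 0; i < 1000000; i++) {
        size_t len = (size_t)(rand() % (sizeof buf + 1));
        for (size_t j = 0; j < len; j++)
            buf[j] = (uint8_t)(rand() & 0xFF);
        int v = parse_label_field(buf, len);
        /* Oracle: the input is rejected, or the decoded value stays in
         * range. A crash or failed assert during the run is a finding. */
        assert(v == -1 || (v >= 0 && v <= 0xFFFF));
    }
    return 0;
}
```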
Conducting the Tests
There is no official accreditation for performing avionics software testing, and this means that potentially anyone can do so, Bowles pointed out. “However, there are certain qualities and skills that are important to be a good avionics software tester, or ‘verification engineer,’ as this role is sometimes known. A background in software or systems engineering is advantageous, as understanding how a system is developed is key to being able to effectively test it,” he said. “Furthermore, some software tests might need to be written in scripting languages that require programming knowledge.”
Typically, software engineers who perform software development and software engineers who become part of the QA team are allowed to conduct avionics software testing, Camacho said. “They are ‘allowed’ because they have the software background to develop software test cases needed to verify and validate avionic systems. There is no particular training required, except being a software engineer. The only type of training required for software engineers will be on the tools that help automate and perform testing.”
To help ensure that all the needed test cases have been created, Parasoft’s C/C++test tool can measure code coverage, which highlights the code that has been exercised during testing, Camacho said. “Code that is unexecuted means that there is no test case that addresses that code. There are other software tools that compile and build test code, archive test files, execute test scripts that automate testing, capture test results, and produce test reports for proof of compliance and auditing purposes. So, just to reiterate, software engineers have the necessary education and background to be allowed to conduct avionics software testing.”
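The coverage idea itself can be demonstrated with GCC’s freely available gcov instrumentation, used here purely as an illustration (certification projects rely on qualified coverage tools). The function and values are invented:

```c
/* Structural coverage sketch using GCC's gcov toolchain.
 *
 * Build and run:
 *   gcc --coverage -O0 demo.c -o demo
 *   ./demo
 *   gcov demo.c      # annotates each line with its execution count
 */
#include <stdio.h>

static const char *flap_mode(int angle)
{
    if (angle <= 0)  return "RETRACTED";
    if (angle < 20)  return "TAKEOFF";
    return "LANDING";   /* never reached by the calls below: gcov
                           reports it unexecuted, i.e. a missing
                           test case */
}

int main(void)
{
    printf("%s\n", flap_mode(0));
    printf("%s\n", flap_mode(10));
    /* No call exercises angle >= 20, so coverage is incomplete. */
    return 0;
}
```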
Current and Coming Developments
In the evolution of testing, the drivers have been safety and security. “This appears to be further expanding into multiple condition coverage (MCC), which is more thorough than modified condition decision coverage (MC/DC) and requires a much greater number of test cases — two to the power of the number of conditions in the code statement,” said Camacho. “MCC is not officially mandated, but it’s another level of safety that could be adopted in the future.”
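To make the arithmetic concrete: for a hypothetical decision with three conditions, MCC requires all 2^3 = 8 input combinations, while MC/DC can be achieved with as few as n + 1 = 4 vectors, each demonstrating that one condition independently affects the outcome. A sketch in C:

```c
/* Hypothetical three-condition decision illustrating MCC vs. MC/DC
 * test-case counts. Nothing here is from a real avionics program. */
#include <assert.h>
#include <stdbool.h>

static bool deploy_spoilers(bool on_ground, bool throttle_idle, bool armed)
{
    return on_ground && throttle_idle && armed;   /* 3 conditions */
}

int main(void)
{
    /* MCC would require all 2^3 = 8 combinations:
     * FFF, FFT, FTF, FTT, TFF, TFT, TTF, TTT.
     *
     * One minimal MC/DC set needs only n + 1 = 4 vectors, each
     * showing one condition independently flipping the outcome: */
    assert(deploy_spoilers(true,  true,  true)  == true);   /* baseline */
    assert(deploy_spoilers(false, true,  true)  == false);  /* on_ground matters */
    assert(deploy_spoilers(true,  false, true)  == false);  /* throttle_idle matters */
    assert(deploy_spoilers(true,  true,  false) == false);  /* armed matters */
    return 0;
}
```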
Security boils down to securing the data, and that data exists in different forms at various levels of abstraction within the scope of the aircraft and avionics systems, Camacho explained. “Since these avionic systems are connected, one must secure them at every entry and exit point, including down to the subsystems and units of software that exist. Data at all these levels needs to be secure,” he said. “Testing to ensure that the data is secure is done through various test methods. Some of these include security scanning with coding standards like CERT, unit testing, system testing, fuzz testing, penetration testing, brute-force attacks, and checking ingress and egress points for unauthorized network access.”
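As a small illustration of securing a single ingress point, the sketch below validates an untrusted frame before any of its data enters a subsystem, in the spirit of CERT C guidance on tainted input; the frame layout and size limits are invented for the example.

```c
/* Hypothetical ingress check: validate an untrusted frame at the
 * boundary before any copy into subsystem state. */
#include <stdint.h>
#include <string.h>

#define MAX_PAYLOAD 64u

typedef struct {
    uint8_t  payload[MAX_PAYLOAD];
    uint32_t length;
} msg_t;

/* Returns 0 on success, -1 if the frame is rejected at the boundary. */
static int ingest_frame(msg_t *out, const uint8_t *frame, size_t frame_len)
{
    if (frame == NULL || frame_len < 4)
        return -1;                       /* too short to carry a header */

    /* Length declared by the sender: untrusted until checked. */
    uint32_t declared = (uint32_t)frame[0] | ((uint32_t)frame[1] << 8);

    /* Reject inconsistent or oversized lengths before any copy. */
    if (declared > MAX_PAYLOAD || declared + 4u > frame_len)
        return -1;

    memcpy(out->payload, frame + 4, declared);
    out->length = declared;
    return 0;
}

int main(void)
{
    const uint8_t good[] = {2, 0, 0, 0, 0xAB, 0xCD};
    const uint8_t bad[]  = {0xFF, 0xFF, 0, 0};   /* claims a huge payload */
    msg_t m;
    return (ingest_frame(&m, good, sizeof good) == 0 &&
            ingest_frame(&m, bad,  sizeof bad)  == -1) ? 0 : 1;
}
```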
The software security objectives that avionics software testing also has to satisfy include the ones described in DO-326A and ED-202A, titled “Airworthiness Security Process,” said Paul Butcher, senior software engineer and AdaCore’s lead engineer in the UK for HICLASS. “These publications, and their supporting guidelines DO-356A and ED-203A, describe testing methodologies that differ from standard verification testing and instead introduce the term ‘refutation.’ The goal behind refutation testing is to plan a test strategy that aims to refute the claim that the system is secure. More specifically, we utilize refutation testing techniques to measure security assurance by purposefully adopting the mindset of an attacker and trying to identify and show the exploitation of any application vulnerabilities.”
There are two distinct categories in security refutation testing techniques: dynamic and static, Butcher explained. “Static analysis techniques, including source code analyzers and formal verification, aim to identify potential run-time and logic errors prior to code execution. Dynamic analysis techniques, including constraint checking run-time environments and negative testing techniques such as fuzz testing, are exercised as the application is executing,” he said. “One way to consider the difference is to think of static analysis as the act of identifying known categories of vulnerabilities within the application, whilst dynamic analysis is more about finding unknown categories of vulnerabilities. Both techniques are complementary to each other, and the recommendation is to adopt a layered approach where multiple testing methodologies are used to argue security, and therefore safety, assurance.”
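One concrete form of the dynamic side is a constraint-checking run-time environment. Languages such as Ada perform range and bounds checks natively; the C sketch below approximates the idea with an explicit checked accessor, so a negative test traps the violation rather than silently corrupting memory. Names and sizes are illustrative only.

```c
/* Sketch of run-time constraint checking for negative testing. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define N_SENSORS 8

static int16_t readings[N_SENSORS];

/* Checked accessor: any out-of-range index is trapped at run time
 * instead of silently reading adjacent memory. */
static int16_t read_sensor(size_t idx)
{
    if (idx >= N_SENSORS) {
        fprintf(stderr, "constraint violation: index %zu\n", idx);
        abort();   /* in a test campaign, this trap is a finding */
    }
    return readings[idx];
}

int main(void)
{
    (void)read_sensor(3);   /* in range: passes */
    (void)read_sensor(8);   /* negative test: expected to trap */
    return 0;
}
```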
Presently, one of the biggest challenges facing the industry is the testing and certification of avionics software designed for use on multicore platforms, observed Bowles. “The adoption of multicore processors in the avionics industry is growing due to their improved SWaP (size, weight, and power) characteristics and the long-term supply chain issues of sourcing single-core processors,” he said. “However, the use of multicore processors for safety-critical avionics applications presents a range of challenges due to their nondeterministic behavior. Guidance addressing the testing of multicore avionics software applications has been incorporated into DO-178C (ED-12C) via EASA’s recent AMC 20-193 document, with the FAA’s equivalent AC 20-193 expected to be released shortly.”
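The nondeterminism Bowles describes is largely timing interference: software on one core contends with software on other cores for shared caches, interconnects, and memory. The POSIX C sketch below illustrates only the measurement idea, recording the spread between the best and worst observed execution times of a memory-bound loop; actual A(M)C 20-193 evidence comes from qualified timing-analysis tools and controlled interference scenarios.

```c
/* Sketch: observe run-to-run execution-time variation of the same
 * code path. On a loaded multicore part, the max/min spread grows as
 * other cores contend for shared resources. POSIX clock assumed. */
#include <stdio.h>
#include <time.h>

#define BUF_WORDS (1 << 20)   /* 8 MiB of longs on LP64: exceeds private caches */
static volatile long buf[BUF_WORDS];

static long long elapsed_ns(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1000000000LL + (b.tv_nsec - a.tv_nsec);
}

int main(void)
{
    long long min = -1, max = 0;
    for (int run = 0; run < 50; run++) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (long i = 0; i < BUF_WORDS; i++)
            buf[i] += i;                 /* memory-bound work */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        long long ns = elapsed_ns(t0, t1);
        if (min < 0 || ns < min) min = ns;
        if (ns > max) max = ns;
    }
    /* The gap between min and max is the interference that worst-case
     * timing arguments for multicore certification must bound. */
    printf("min %lld ns, max %lld ns\n", min, max);
    return 0;
}
```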
Things are headed in the direction of incorporating artificial intelligence and machine learning as part of avionics software testing, according to Camacho. “Testing avionics software is expensive, so the next horizon in cutting cost, labor, and time needs to include artificial intelligence and machine learning. Testing where humans are involved is also error-prone, and these errors could be expensive. If artificial intelligence could analyze code and automatically determine how to test the code to ensure high-quality, safe, and secure avionic systems, that would be a dream come true for all industries developing software,” he said.