
WEBINAR

Object Code Structural Coverage for DO-178C

Achieving DO-178C compliance for software in avionics systems presents unique challenges, especially when it comes to ensuring thorough structural coverage at the object code level. This presentation explores practical methods to streamline this process, focusing on how to reduce testing effort and accelerate time to market.

Learn methods to focus testing and reduce redundant effort when testing for structural machine code coverage as described by DO-178C 6.4.4.2. High-level source languages, like C++, are ever evolving. The easier it is for developers to express intent in a high-level source language, the harder it is to trace that intent to the equivalent compiler-generated machine code.

Guided by examples, this presentation takes an iterative approach to the process of producing complete structural coverage of machine code and shows best practices in action to reduce time to market.

Key Takeaways

  • Simple categorizations for structural machine code coverage deficiencies.
  • Example solutions for common categories of untested machine code.
  • Strategies to minimize effort involving machine language debugging.

Understanding DO-178C Challenges

DO-178C compliance involves several key objectives that can be influenced by your software’s Design Assurance Level (DAL). These include:

  • Bi-directional Requirements Traceability: Ensuring a clear link from requirements to test cases, code, and reviews.
  • Compliance Objective Satisfaction: Identifying and marking off necessary objectives.
  • Target Hardware Validation: Testing on the actual system that will be certified.
  • Coding Standard Compliance: Adhering to standards like MISRA, AUTOSAR C++14, or custom rules.
  • Data and Control Coupling Analysis: Performing and documenting this analysis if required by your DAL.
  • Structural Code Coverage: Verifying that requirements-based tests exercise the code structure to the degree your DAL requires (statement, decision, or MC/DC coverage).
  • Assembly Code Coverage: Meeting coverage requirements at the object (assembly) code level if mandated.
  • Tool Qualification: Using tools that come with a Tool Qualification package for your development ecosystem.
  • Compliance Management Plan: Having a clear plan and expert guidance for the certification process.

Parasoft offers solutions to address these challenges, including integrations with ALM tools for traceability, support for various coding standards, and on-target hardware verification capabilities.

Explore how DO-178C structures the software compliance process in our comprehensive guide.

Object Code Verification: A Deeper Dive

Achieving complete structural coverage at the object code level is a significant task. The strategy here is to focus on small, isolated “gap sections” of deficient structural coverage, applying object-level instrumentation on top of the results of source-level testing practices.

By instrumenting at the source level for requirements-based, MCDC (Modified Condition/Decision Coverage), and on-target testing, you can get very close to complete object-level structural coverage without examining the object code directly. Cumulative reports can then combine the results of these testing methods with object-level instrumentation.

The MCDC Measurability Difference: Source vs. Assembly

Consider a simple A OR B expression. Achieving 100% MCDC coverage at the source level requires three test runs to isolate the effects of A and B. When the equivalent C code is compiled, the short-circuit evaluation of the OR typically produces two separate conditional branches, so the assembly may have a different branching structure than the single source-level decision.

If you start MCDC testing from scratch at the assembly level, you might leave a branch uncovered by omitting one of the necessary test runs. However, if you’ve already performed the source-level MCDC testing, that third run is likely already included, providing complete assembly-level branch coverage without extra effort.
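A minimal sketch of this scenario in C follows (the check_flags function is a hypothetical stand-in; the three calls are the standard MC/DC test set for a two-operand OR):

    #include <assert.h>

    /* Hypothetical function: a decision built from a simple A OR B condition. */
    static int check_flags(int a, int b)
    {
        /* Because || short-circuits, the compiler typically lowers this single
           source-level decision into two separate conditional branches. */
        if (a || b) {
            return 1;
        }
        return 0;
    }

    int main(void)
    {
        /* MC/DC for (A || B) requires three runs: */
        assert(check_flags(0, 0) == 0);  /* A=false, B=false: decision is false */
        assert(check_flags(1, 0) == 1);  /* A=true,  B=false: A alone flips the outcome */
        assert(check_flags(0, 1) == 1);  /* A=false, B=true:  B alone flips the outcome */
        return 0;
    }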

Deficient Structural Object Code Coverage for Conditional Branching

Compilers often introduce implicit checks, boundary checks, or optimizations that aren’t directly apparent in the high-level source code. As a result, object-level coverage can remain incomplete even when source-level coverage appears complete.

For example, testing a C function might achieve 100% MCDC coverage at the source level. However, when examining the assembly output, you might find only 97% branch coverage, with a specific jump instruction (e.g., jump if above to L51) left unexercised.

This deficiency could be due to an out-of-bounds enum value check introduced by the compiler. To address it, create a specific test case with a parameter that forces the enum value outside its expected range. By running this isolated test and merging its coverage results with your existing MCDC coverage, you can close the gap and increase the overall branch coverage.
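As an illustration (the enum, function, and test values below are hypothetical, not taken from the webinar), a compiler may guard a switch over an enum with an unsigned range check; a single isolated test that casts an out-of-range value exercises that compiler-generated branch:

    #include <assert.h>

    /* Hypothetical enum and function used only for illustration. */
    typedef enum { MODE_IDLE, MODE_RUN, MODE_STOP } run_mode;

    static int mode_priority(run_mode m)
    {
        /* The compiler may emit an unsigned range check (e.g., a "jump if
           above" instruction) before dispatching on the enum value. */
        switch (m) {
        case MODE_IDLE: return 0;
        case MODE_RUN:  return 2;
        case MODE_STOP: return 1;
        default:        return -1;  /* reached only for out-of-range values */
        }
    }

    int main(void)
    {
        /* Source-level MC/DC tests pass only valid enumerators... */
        assert(mode_priority(MODE_RUN) == 2);

        /* ...so one isolated test forces an out-of-range value to exercise
           the compiler-generated bounds-check branch, and its coverage is
           merged with the existing results. */
        assert(mode_priority((run_mode)99) == -1);
        return 0;
    }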

Structural Coverage for Instructions Without Source Code Representation

Some compiler-generated constructs, like file formatting, functional interfacing, or system checks (e.g., stack smashing protection), may not have a direct representation in the source code. To gain coverage for these, you’ll need to instrument the production executable itself.

For instance, if a function ends with a stack check fail call and this path is not covered by your unit tests, you would instrument the production executable. By running the instrumented executable under a debugger and manipulating its state (e.g., modifying a register value such as RCX to force the jump to the failure path), you can exercise the deficient path. The coverage data from this execution can then be merged with your existing coverage reports, achieving 100% instruction coverage for that section.
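A minimal sketch of how such a path arises, assuming a GCC or Clang toolchain (the function name and build flags are illustrative; __stack_chk_fail is the usual stack-protector failure handler):

    #include <string.h>

    /* Build with stack protection so the compiler inserts a canary check,
       for example:  gcc -fstack-protector-all -c copy_id.c */
    void copy_id(char *dst, const char *src)
    {
        char buf[16];  /* local buffer causes a stack canary to be placed */

        strncpy(buf, src, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        strcpy(dst, buf);

        /* The compiler appends an epilogue that compares the canary and, on
           mismatch, calls __stack_chk_fail. That call has no source-level
           representation; as described above, it is exercised by running the
           instrumented production executable under a debugger and altering
           the register holding the comparison value so the failure branch is
           taken, then merging the recorded coverage. */
    }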

Strategies for Success

  • Prioritize Source-Level Testing: Make every effort to complete testing goals at the high-level source code, MCDC, and functional/production testing modes. This significantly reduces the number of gaps you need to address at the assembly level.
  • Focus on Gap Analysis: Constraining your focus to specific deficiencies makes it easier to find the necessary execution paths and document them.
  • Leverage Collaborative Coverage: By augmenting coverage with small, unique portions for specific testing purposes, you promote a more collaborative effort where individuals can contribute partial coverage without overlapping work on shared code.
  • Automate Instrumentation: Use tooling that augments the original toolchain to inject assembly instructions for recording execution paths. This can be integrated into build processes such as Makefiles or CMake.
  • Merge Coverage Data: Coverage data from different sources (unit tests, production executables) can be merged by applying coverage recordings to a common abstract syntax tree (AST) representation of the assembly code, as in the conceptual sketch after this list. This allows for cumulative coverage reporting and can even be used to assert the equivalence of the assembly code produced by unit test builds versus production builds.
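As a conceptual sketch only (the data layout and names below are assumptions, not the Parasoft implementation), merging can be viewed as a per-instruction union of hit flags over one shared model of the assembly:

    #include <stdio.h>

    #define NUM_INSTRUCTIONS 8  /* instructions in the shared assembly model */

    /* Hypothetical hit flags recorded by two runs of the same object code,
       indexed by instruction position in the shared model. */
    static const int unit_test_hits[NUM_INSTRUCTIONS]  = {1, 1, 1, 0, 1, 1, 0, 1};
    static const int production_hits[NUM_INSTRUCTIONS] = {1, 0, 1, 1, 1, 0, 0, 1};

    int main(void)
    {
        int covered = 0;

        /* Merge = logical OR per instruction: covered in any run counts. */
        for (int i = 0; i < NUM_INSTRUCTIONS; i++) {
            covered += (unit_test_hits[i] || production_hits[i]);
        }

        printf("cumulative instruction coverage: %d of %d\n",
               covered, NUM_INSTRUCTIONS);
        return 0;
    }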