Autonomous driving is a highly competitive space, and developer velocity is a mantra. Whoever brings a certified product to market first will have a significant advantage over the competition, so it’s easy for developers to view static analysis and other quality initiatives as obstacles. Because the field is hungry for talent, organizations hire smart developers even if they have no background in safety. But developers who come from a culture without functional safety are often unaware of the quality processes required for safety-critical software development. This can make cultural buy-in a challenge.
Sometimes it feels like building internal consensus for quality-oriented practices requires a master’s degree in psychology, or the skills of a trained negotiator. In past projects, I’ve been responsible for introducing static analysis and AUTOSAR C++14 coding standard compliance as a sustainable process. Autonomous driving software is highly innovative, and its components are developed in modern C++. With that in mind, AUTOSAR C++14 is the most appropriate standard for autonomous driving software: it supports modern C++ and was created for safety-oriented development.
To convince the unconvinced, I’ve given multiple presentations covering different aspects of the problem. But even with all these discussions and agreements, some developers still resist analyzing all the code they create. Here are some of the main points I focus on to get the right processes in place:
Autonomous driving technology is at a very early stage. A lot of the source code created is just a prototype to test new ideas. Some developers don’t want to “waste” time making it coding-standard compliant; they just want to write something quickly and test it. A typical story I’ve heard is, “I just want to test this new algorithm, and if it works I’ll rewrite the code to make it clean.” This sounds innocent enough, but in reality it just grows technical debt.
When we move from prototype to prototype, we usually carry a lot of code forward, sometimes as much as 80% of it. So we can’t afford to prototype something with bad code and fix it later, because we know from the very beginning that we won’t have time for that. Even if something is a prototype, the code must be compliant, because the final product will contain a significant amount of code that started life as a prototype.
If you frame it not as doing the work now instead of later, but as doing it now to avoid spending an unknown amount of time at the end, developers start to see compliance as enhancing the process rather than slowing it down. If you can fit the process efficiently into the developer’s workflow without hurting velocity or creativity, it becomes much easier to work with. It’s much easier to keep code tidy and clean as you go than to clean up a huge mess at the end. Building consistent, maintainable practices for writing compliant code means there is less of a mess to find later.
But even if you successfully introduce a coding standard compliance process early on, there will inevitably be some (significant) amount of code already created by the team, plus some that was inherited. And while you are selecting the static analysis tool (Parasoft C/C++test) and picking the standard (AUTOSAR C++14), the team is still creating a lot of code without any compliance policies! So it’s important to also create a policy for legacy code. Two great policies are:
With these strategies, you can address legacy code, introduce new code, and continue to keep the place tidy.
To be successful in adopting static analysis and coding standards compliance processes across the development organization, you will benefit from doing the following:
Ijaz is a functional safety engineer with 14+ years of experience in software development and testing. His recent work has focused on self-driving technology: leading the effort to implement ISO 26262 and building simulated scenarios to rigorously test self-driving systems.