A self-described “accidental technologist,” Justyn Trenner serves as the Corporate Development Director for QA Financial, a company focused on DevOps, automation, and AI technologies. Founded in 2015, the firm seeks to improve how e-commerce and financial firms monitor, uncover, and address issues in software development, testing, and delivery.
With such a focus, it’s no wonder that its partners include companies from across the globe, such as Eggplant, IBM, and Parasoft. Its partnership with the Bank of England focuses on digital resilience and the role of inspectable benchmarks in DevOps.
QA Financial recognizes that automation and automated testing are key tools in mitigating risk in the development process. After all, if you can catch issues early, you can better avoid deployment disasters and data breaches.
Quality failures can lead to costly fixes as well as the loss of harder-to-quantify assets like faith in your products or services. Quality engineering is an important part of software development, but how can you demonstrate the effectiveness of a quality engineering approach?
More importantly, how do you establish a return on investment (ROI) to determine the right budget and level of effort for delivering consistent quality performance?
“API testing and fixing can account for double the quality spending per defect than other defects. Data provisioning is also a key source of wastage (ineffective development) and duplicate spending.”
—Justyn Trenner, QA Financial
Leveraging QA Vector® Analytics has helped firms determine their optimal spend allocation, which vendors to use, and which operating model fits them best. Though AI plays a key role, the methodology is founded on risk prediction, resilience, and on-time performance.
When asking questions related to operational resilience, the matter of shorthand standards is a critical one, and it centers on several key questions.
QA Financial’s QA Vector Analytics provides the means to address these questions, improving digital resilience with inspectable benchmarks.
Learn how your team can implement these benchmarks to better show ROI when it comes to quality. Watch Justyn’s presentation, Benchmarking the Value of Quality Engineering, at the 2021 Automated Software Testing & Quality Summit. See why the old saying, “Speed. Cost. Quality. Pick two, because you can’t have all three,” doesn’t hold true anymore when it comes to software.
Quantifying how much you should spend on quality efforts can seem difficult. The data shows that API testing and fixing can account for twice the cost per defect of other kinds of defects. The result is wasted time, effort, and budget on inefficient procedures and easily fixed issues that could have been dealt with beforehand.
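As a rough illustration of the cost-per-defect comparison, the sketch below uses invented figures (not QA Financial’s benchmark data) to show how spend per defect can be broken out by defect category:

```python
# Hypothetical illustration of comparing quality spend per defect
# across defect categories. All figures are invented for the example,
# not drawn from QA Financial's benchmarks.

def cost_per_defect(total_spend, defect_count):
    """Average remediation spend for one defect category."""
    return total_spend / defect_count

# Assumed example numbers: same defect count, different total spend.
api_cost = cost_per_defect(total_spend=120_000, defect_count=300)    # API defects
other_cost = cost_per_defect(total_spend=60_000, defect_count=300)   # other defects

print(f"API defects:   ${api_cost:,.0f} per defect")
print(f"Other defects: ${other_cost:,.0f} per defect")
print(f"Ratio: {api_cost / other_cost:.1f}x")  # double the cost per defect
```

Tracking even a simple metric like this per category makes it easier to justify quality budgets to leadership with numbers rather than anecdotes.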
Cutting corners or sticking with what you’ve always used may seem like a good approach at first, but it can end up costing you dearly in money, talent, client base, or all three. QA Vector Analytics empowers financial firms and institutions to measure quality performance against rivals and peers across projects and development approaches.
When trying to create parseable standards for measuring quality ROI, the biggest hurdle was accounting for how a team’s methodology affects development. After all, what works well for one team might be totally impossible for another.
Waterfall, the traditional methodology for software development, is often used by enterprise-level companies for its linear flow with defined goals.
Before the next phase can start, the previous one must be totally completed. There’s also no option to go back and modify the project. This rigidity can result in costly and slow development.
The Agile methodology is a common option that allows development teams to minimize risks like cost overruns, bugs, and requirement changes as they add new functionality. Teams push changes in small increments and iterations. Examples include Crystal, Scrum, feature-driven development (FDD), and more.
Rapid application development (RAD) condenses the process into just four phases and typically carries low investment costs while still producing a high-quality product. While quick and great for projects with well-defined objectives, RAD requires a very stable team of expert developers.
Such a deep knowledge requirement and the ability to work and pivot quickly when needed may not be for every team.
DevOps is both a set of practices to support organizational culture and a development methodology. At its core, this approach focuses on collaboration, improving time to market, and reducing failure rates for new releases.
Automating continuous delivery often means automating continuous testing, which helps teams catch bugs and potential threats earlier in the process. However, manual testing is still a crucial part of the process in tandem with automated testing.
“Shifting to DevOps thinking that you can fire quality experts or thinking you can just turn them into developers leads to a lack of testing expertise and a high cost of fixing, since your most expensive engineers are doing all that fixing.”
—Justyn Trenner, QA Financial
QA Vector Analytics provides the insights companies can use to streamline their workflows and methodologies.
The image above displays how well companies performed with different models. For instance, you’ll see that FinTechs using the hybrid model scored very well for defect escape overall with trivial defects being their best performance area. However, when you look at critical defects, hybrid model FinTechs scored poorly.
The reason for this is the “fix-fast” epic, where there’s a Release Date and then a Release Date 2 just a few days later. This effectively turns users into testers, which is a risky proposition if you’re concerned about security risks, data breaches, and the like.
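A defect escape rate like the one behind these comparisons can be sketched simply. The counts below are invented for illustration; a benchmark such as QA Vector Analytics would use measured data:

```python
# Hypothetical sketch of a defect escape rate metric, broken down by
# severity. All counts are invented for the example.

def escape_rate(found_in_production, found_total):
    """Share of defects that escaped testing and reached users."""
    return found_in_production / found_total if found_total else 0.0

defects = {
    # severity: (escaped to production, total found)
    "trivial":  (5, 100),
    "major":    (8, 40),
    "critical": (6, 15),
}

for severity, (escaped, total) in defects.items():
    print(f"{severity:>8}: {escape_rate(escaped, total):.0%} escaped")
```

With these assumed numbers the trivial category shows the lowest escape rate and the critical category the highest, mirroring the hybrid-model pattern described above: good performance overall can hide poor performance exactly where it matters most.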
As case studies like Parasoft’s Caesars Entertainment and Test Automation ROI Measurement show, demonstrating ROI to leadership is a central focus of QA Vector Analytics. Achieving all three points of the triangle (speed, cost, and quality) is more than possible with this new approach. Make it so your future self says “thanks” and invest in your quality journey with API testing, service virtualization, and more.
Joy Ruff is a Product Marketing Manager focused on product positioning and marketing content for Parasoft's functional testing tools. With over 25 years of experience, she has provided technical marketing and sales enablement for various enterprise hardware and software solutions.