To make API testing easier, add machine learning to your AI
By Chris Colosimo
October 2, 2018
5 min read
By adding machine learning to our AI-enabled API test creation tool, we’ve helped users take a significant leap forward in productivity and efficiency – with some added side benefits, too!
We released Parasoft SOAtest 9.10.6 today, and it’s my great pleasure to share how we’ve continued to innovate to drive the industry towards autonomous testing. Most notable is a new machine learning capability that lets you train Parasoft SOAtest’s brain: by adding rules to the underlying AI in SOAtest’s Smart API Test Generator, you help SOAtest build better API test scenarios for you.
The value of adding machine learning to automatic API test creation
API testers spend a lot of time understanding how APIs work in order to build that functionality into their test cases, and much of that information and knowledge never leaves that one tester’s head. To take a giant leap forward in productivity and efficiency, you can instead capture all of that specific information automatically with SOAtest’s Smart API Test Generator, and then incrementally train it to grow with your software. How else can you benefit from SOAtest’s new machine learning capability?
- Training: Instead of spending lots of time teaching new API testers how particular APIs work and how to combine them into meaningful test scenarios, you can let SOAtest do it. Since SOAtest already understands the relationships between APIs, new API testers can start testing known APIs immediately and add value right away, rather than spending time learning how to piece everything together.
- Knowledge capture: By leveraging SOAtest’s AI-enabled API test creation tool and training it incrementally, users can define something once, lock it in with a smart test template, and share that template across a large body of testers. This is especially valuable in agile development, where the timeframes for communicating how to test particular APIs to the rest of the team are tight.
- Consistency: Every time you build an API test, you have to think about what you want to accomplish to validate the requirement. By having all of those connections, along with your authentication and validation information, preconfigured for each API, everyone agrees on how each API should be tested, and tests are built against each interface the same way every time. This is also great for debugging: if the APIs all follow a particular pattern, it becomes easy to understand where things break down, because there’s a common set of rules.
- Coverage: Instead of trying to manage and understand the full API inventory, users can benefit from SOAtest’s smart library of resources, which helps users quickly see which APIs have been tested and add additional rules for APIs that haven’t been tested yet, so that when those APIs do get covered as part of a scenario, the resulting test is as meaningful as possible.
How does machine-learning-enabled API test creation work?
SOAtest starts by examining your API traffic and extracting meaningful API test scenarios. To facilitate the dynamic data exchange these scenarios require, SOAtest’s underlying AI takes actions such as picking up on data values that are reused across API calls and connecting them together, or extracting relevant response information and creating automatic regression tests. All of these actions can be thought of as rules.
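To make the idea concrete, here is a minimal conceptual sketch of that kind of rule extraction – not SOAtest’s actual implementation, and all names and data shapes are invented for illustration. It scans recorded traffic for values that appear in one response and are reused in a later request, and records each match as a correlation rule:

```python
def extract_correlation_rules(traffic):
    """traffic: list of recorded calls, each a dict like
    {"endpoint": "/cart", "request": {...}, "response": {...}}"""
    rules = []
    seen = {}  # value -> (endpoint, field) where it was first produced
    for call in traffic:
        # Any request value already seen in an earlier response implies
        # a data dependency between the two endpoints.
        for field, value in call.get("request", {}).items():
            if value in seen:
                src_endpoint, src_field = seen[value]
                rules.append({
                    "extract": {"endpoint": src_endpoint, "field": src_field},
                    "inject": {"endpoint": call["endpoint"], "field": field},
                })
        # Remember where each response value was produced.
        for field, value in call.get("response", {}).items():
            seen.setdefault(value, (call["endpoint"], field))
    return rules

traffic = [
    {"endpoint": "/cart", "request": {"item": "book"},
     "response": {"cartId": "c-42"}},
    {"endpoint": "/cart/items", "request": {"cartId": "c-42"},
     "response": {"quantity": 1}},
]
print(extract_correlation_rules(traffic))
```

Running this on the tiny sample traffic yields a single rule connecting the cart ID produced by `/cart` to the `/cart/items` request that consumes it.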
Rules run AI
These rules aren’t just for basic test steps – they indicate relationships. For example, if I add an item to my cart with one API call, the cart ID will be needed when I make a subsequent call to look at the items in my cart. SOAtest understands many of these relationships automatically; however, humans are smarter than robots (sorry, Marvin), so they can extend the rules by adding additional context to individual API resources.
To extend the example above: while it’s important to get the cart ID, I may also want to create an assertion on the quantity of items that I added to my cart. If SOAtest didn’t already pick that up, a human could come in and add the new rule. Once we define the connection to the quantity-in-cart element as an assertion, every single time that API is used in the future, SOAtest will automatically populate any test that touches that particular API with the correct details to address that requirement.
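A hypothetical illustration of that behavior (invented names, not SOAtest rule syntax): once an assertion rule is registered for an endpoint, it gets attached to every future test step that touches that endpoint.

```python
# endpoint -> list of (field to assert on, where its expected value comes from)
ASSERTION_RULES = {
    "/cart/items": [("quantity", "request.quantity")],
}

def apply_assertion_rules(test_steps):
    """Attach the configured assertions to each step whose endpoint has a rule."""
    for step in test_steps:
        for field, source in ASSERTION_RULES.get(step["endpoint"], []):
            step.setdefault("assertions", []).append(
                {"field": field, "equals_value_from": source})
    return test_steps

scenario = [
    {"endpoint": "/cart", "request": {"item": "book", "quantity": 2}},
    {"endpoint": "/cart/items", "request": {}},
]
for step in apply_assertion_rules(scenario):
    print(step["endpoint"], step.get("assertions", []))
```

The point of the sketch is the workflow: the rule is defined once, and every scenario generated afterwards picks it up without the tester re-creating the assertion by hand.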
The AI helps you perform these repetitive actions by building its own awareness of how your APIs should work, and subsequently how the API should be tested. This means that by simply using it, the machine will learn all of the necessary actions to be performed on specific APIs. As a result, the cost of testing those APIs can go down over time.
Why are you hitting yourself?
There’s no reason we should ever do the same action multiple times when we can train a computer to do it for us. But testers do this every single day with API testing, spending excessive time understanding how APIs work and then building tests that validate the proper pieces: creating data connections, assertions, authentication, and so on.
You might be asking yourself, “This sounds pretty cool, but do I now have to spend a lot of time training a machine how to test my APIs? Doesn’t this just add more to my workflow?” Thankfully, the answer is no. First, you have the built-in AI that starts by making a lot of these connections for you. Second, you don’t have to train it manually. (It’s Parasoft, after all, and we like to make things automatic.) Instead, we made a handy little connector in SOAtest, so you can leverage the work that you’re already doing. With a single click, SOAtest can take your test case, interpret it into a rule, and put it right inside the brain. You can do this at any level, from a single test case up to the entire test file.
So SOAtest learns from the work that you are doing to create tests, and automatically applies that learning to create tests that do the same thing when you operate on those APIs in future test scenarios. Training the AI as you go allows you to take an incremental approach to machine learning, and everybody can contribute. And at the end of the day, you will have a rich and detailed set of resources outlining exactly what you need to do every time you test your APIs.
Other new features in SOAtest 9.10.6
Machine learning wasn’t the only thing we added to this release. Users can also look forward to the following updates to SOAtest’s Smart API Test Generator.
Targeted assertions
We added a more surgical approach to validating your payloads: targeted assertions. You can think of a diff tool as a snapshot of the response payload that immediately lets you know if data is changing, and SOAtest now makes diffs easier to use by parameterizing dynamic values such as timestamps, as well as dynamic data it has seen in previous calls. But if you only want to validate a particular element, that’s where the new smart, automatic assertions come in. Equipped with the ability to analyze the patterns and relationships in requests and responses, SOAtest checks whether key element values are of interest; if they are, it automatically creates a targeted assertion on that particular element, potentially parameterizing its value. So you can take a more strategic approach to validation and ignore the rest of the noise.
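A minimal sketch of the targeted-assertion idea, under my own assumptions (field names and the timestamp pattern are invented, and this is not SOAtest’s logic): rather than diffing the whole payload, values known to be dynamic are parameterized away – timestamps are recognized with a regex here – and only the fields you care about are compared.

```python
import re

# Treat ISO-8601-style values as dynamic; never diff them literally.
TIMESTAMP = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}")

def validate(payload, expected, targeted_fields):
    """Compare only the targeted fields; return a list of failures."""
    failures = []
    for field in targeted_fields:
        actual = payload.get(field)
        if TIMESTAMP.match(str(actual or "")):
            continue  # dynamic value: parameterized, not asserted on
        if actual != expected.get(field):
            failures.append((field, expected.get(field), actual))
    return failures

payload = {"status": "shipped", "updatedAt": "2018-10-02T09:15:00Z", "sku": "B-17"}
expected = {"status": "shipped", "sku": "B-17"}
print(validate(payload, expected, ["status", "updatedAt", "sku"]))  # → []
```

A full-payload diff would flag `updatedAt` on every run; the targeted approach stays quiet until a field you actually care about changes.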
Domain inclusion and exclusion
Because there’s so much functionality built into a UI, a simple action can elicit a wide array of API calls that are out of scope for the required test, or just plain noisy. With this release, you can now include and exclude calls from specific domains, allowing you to limit the scope of what is captured by identifying the domains that are relevant to the specific scenario. Additionally, you can take the opposite approach by blacklisting certain domains that you simply don’t want to show up in your API tests.
Your tool should work for you
Our innovations in the last several months have all been about embracing the technologies of the future to simplify critical testing practices. My hope is that with these new additions to our technology, you will be able to increase your API test coverage, with a renewed excitement for building meaningful API test suites. It’s 2018, and your tool should work for you. Let it handle the nitty-gritty.