We released Parasoft SOAtest 9.10.6 today, and it’s my great pleasure to share how we’ve continued to innovate toward autonomous testing. Most notably, a new machine learning capability lets you train SOAtest’s brain: by adding rules to the underlying AI in SOAtest’s Smart API Test Generator, you help SOAtest build better API test scenarios for you.
API testers spend a lot of time understanding how APIs work in order to build that functionality into their test cases, and much of that knowledge leaves when the tester does. To take a giant leap forward in productivity and efficiency, you can instead capture all of that specific information automatically with SOAtest’s Smart API Test Generator, and then incrementally train it to grow with your software. How else can you benefit from SOAtest’s new machine learning capability?
SOAtest starts by examining your API traffic and extracting meaningful API test scenarios. To facilitate the dynamic data exchange required to create these scenarios, SOAtest’s underlying AI takes actions such as picking up on data values that are reused across API calls and connecting them together, or extracting relevant response information and creating automatic regression checks. All of these actions can be thought of as rules.
These rules aren’t just for basic test steps – they capture relationships. For example: if I add an item to my cart with an API call, the cart ID will be important when I make a subsequent call to look at the items in my cart. SOAtest understands many of these relationships automatically; however, humans are smarter than robots (sorry, Marvin), so humans can extend the rules by adding additional context to individual API resources.
To extend the earlier example: while it’s important to capture the cart ID, I may also want to create an assertion on the quantity of items I added to my cart. If SOAtest didn’t already pick that up, a human could come in and add the new rule. Once we define the connection to the quantity-in-cart element as an assertion, every time that API is used in the future, SOAtest will automatically populate any tests that touch that particular API with the correct details to address that requirement.
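To make the idea of a “rule” concrete, here is a minimal Python sketch of the concept described above. Everything here – the `RULES` table, the `apply_rules` function, and the field names – is invented for illustration; it is not SOAtest’s actual format or API.

```python
# Hypothetical sketch of a "rule" table: each API resource maps to actions
# the test generator should take (names are illustrative, not SOAtest's).
RULES = {
    "POST /cart/items": [
        {"action": "extract", "field": "cartId"},    # reuse in later calls
        {"action": "assert", "field": "quantity"},   # validate on every test
    ],
}

def apply_rules(resource, response):
    """Return the extractions and assertions a generated test should include."""
    extracted, assertions = {}, []
    for rule in RULES.get(resource, []):
        value = response.get(rule["field"])
        if rule["action"] == "extract":
            extracted[rule["field"]] = value
        elif rule["action"] == "assert":
            assertions.append((rule["field"], value))
    return extracted, assertions

# Example: a captured response from the add-to-cart call
extracted, assertions = apply_rules(
    "POST /cart/items", {"cartId": "c-42", "quantity": 3}
)
print(extracted)    # {'cartId': 'c-42'}
print(assertions)   # [('quantity', 3)]
```

The point of the sketch: once the quantity rule exists, every future test that touches this resource gets the same assertion for free.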
The AI helps you perform these repetitive actions by building its own awareness of how your APIs should work, and consequently how they should be tested. This means that simply by using it, the machine learns the actions that need to be performed on specific APIs. As a result, the cost of testing those APIs goes down over time.
There’s no reason we should ever do the same action multiple times when we can train a computer to do it for us. Yet testers do exactly that every day with API testing, spending excessive time understanding how APIs work and then building tests that validate the proper pieces: data connections, assertions, authentication, and so on.
You might be asking yourself, “This sounds pretty cool, but do I now have to spend a lot of time training a machine how to test my APIs? Doesn’t this just add more to my workflow?” Thankfully, the answer is no. First, the built-in AI starts by making a lot of these connections for you. Second, you don’t have to train it manually. (It’s Parasoft, after all, and we like to make things automatic.) Instead, we made a handy little connector in SOAtest, so you can leverage the work you’re already doing. With a single click, SOAtest can take your test case, interpret it into a rule, and put it right inside the brain. You can do this at any level, for a single test case or for an entire test file.
So SOAtest learns from the work you do to create tests, and automatically applies that learning when you operate on those APIs in future test scenarios. Training the AI as you go lets you take an incremental approach to machine learning, and everybody can contribute. At the end of the day, you’ll have a rich, detailed set of resources outlining exactly what needs to happen every time you test your APIs.
Machine learning wasn’t the only thing we added to this release. Users can also look forward to the following updates to SOAtest’s Smart API Test Generator.
We added a more surgical approach to validating your payloads: targeted assertions. You can think of a diff as a snapshot of the response payload that immediately lets you know if data is changing, and SOAtest now makes diffs easier to use by parameterizing dynamic values such as timestamps, as well as dynamic data it has seen in previous calls. But if you only want to validate a particular element, that’s where the new smart, automatic assertions come in. By analyzing the patterns and relationships in requests and responses, SOAtest checks whether key element values are of interest; if they are, it automatically creates a targeted assertion on that element, and potentially parameterizes its value. So you can take a more strategic approach to validation and ignore the rest of the noise.
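One way to picture the separation of stable values (worth asserting on) from dynamic ones (worth parameterizing) is to compare two captures of the same response. The sketch below is purely illustrative of that idea – the function name, field names, and comparison strategy are assumptions, not how SOAtest is implemented.

```python
# Illustrative only: decide, per field, whether to create a targeted
# assertion (value is stable across captures) or to parameterize it
# (value is dynamic, e.g. a timestamp).
def build_assertions(capture_a, capture_b, fields_of_interest):
    assertions = {}
    parameterized = []
    for field in fields_of_interest:
        if capture_a.get(field) == capture_b.get(field):
            assertions[field] = capture_a[field]   # stable -> assert exact value
        else:
            parameterized.append(field)            # dynamic -> parameterize, skip assert
    return assertions, parameterized

# Two captures of the same response payload, minutes apart
a = {"status": "OK", "total": 59.97, "timestamp": "2018-05-01T10:00:00Z"}
b = {"status": "OK", "total": 59.97, "timestamp": "2018-05-01T10:05:00Z"}

assertions, parameterized = build_assertions(a, b, ["status", "total", "timestamp"])
print(assertions)     # {'status': 'OK', 'total': 59.97}
print(parameterized)  # ['timestamp']
```

The dynamic `timestamp` gets parameterized rather than asserted, so the generated test validates the elements that matter without failing on noise.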
Because there’s so much functionality built into a UI, a simple action can trigger a wide array of API calls that are out of scope for the required test, or simply noisy. With this release, you can include and exclude calls from specific domains, limiting the scope of what is captured by identifying the domains relevant to the specific scenario. You can also take the opposite approach, blacklisting domains that you simply don’t want to show up in your API tests.
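The include/exclude behavior amounts to a domain filter over captured traffic. Here is a small Python sketch of that logic; the function and parameter names are invented for this example and do not correspond to SOAtest configuration options.

```python
# Hypothetical domain filter: keep a captured call only if its host passes
# the include/exclude scoping described above.
from urllib.parse import urlparse

def in_scope(url, include=None, exclude=()):
    """True if the call's domain should appear in the generated API test."""
    host = urlparse(url).hostname
    if host in exclude:                      # blacklist always wins
        return False
    return include is None or host in include

calls = [
    "https://api.myshop.com/cart/items",
    "https://analytics.tracker.net/ping",
    "https://cdn.assets.io/logo.png",
]

# Scope capture to the one domain relevant to the scenario
scoped = [c for c in calls if in_scope(c, include={"api.myshop.com"})]
print(scoped)   # ['https://api.myshop.com/cart/items']
```

The same filter supports the opposite approach: pass `exclude={"analytics.tracker.net"}` with no include list to drop only the noisy domains.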
Our innovations over the last several months have all been about embracing the technologies of the future to simplify critical testing practices. My hope is that with these new additions, you will increase your API test coverage with a renewed excitement for building meaningful API test suites. It’s 2018, and your tool should work for you. Let it handle the nitty-gritty.
As a Product Manager at Parasoft, Chris drives product strategy for Parasoft’s functional testing solutions. His expertise in accelerating the SDLC through automation has taken him to major enterprise deployments at companies such as Capital One and CareFirst.