Tips for Oracle Middleware Testing and Migration
August 20, 2015
4 min read
With Oracle middleware as the heartbeat of your application integration strategy, making a change to this infrastructure can have heart-stopping consequences. A migration to a new platform or a new version of the platform is even more precarious since new version features could cause undetected impacts. Leveraging simulation technology (service virtualization) and automated API (service) testing technologies in these scenarios can dramatically reduce the risk associated with change or migration.
There are three primary challenges associated with ensuring that Oracle middleware is achieving goals associated with system reliability, security, and performance:
- Exercising a complete end-to-end workflow is impeded by the high number and complexity of system dependencies involved in a typical end-to-end transaction.
- Continuously testing that workflow as the system is being developed or migrated is complicated by the challenge of constructing, initiating, and evaluating tests for such a specialized environment.
- Realistic performance for dependent applications is difficult to achieve in a test environment due to access constraints and system complexity, but it is nevertheless critical for achieving thorough and accurate validation, and thus more effective optimization, prior to deployment.
With comprehensive support for working within Oracle ecosystems, Parasoft’s integrated API (service) testing and service virtualization (lab management) solution addresses these challenges, enabling you to effectively test your Oracle middleware as you’re integrating and evolving it.
“Shift Left” End-to-End Testing
By providing a comprehensive, easy-to-use solution for enabling and continuously executing end-to-end testing through Oracle middleware, Parasoft enables organizations to test earlier, faster, and more completely.
Parasoft’s industry-leading message/protocol support (over 120 service and message types, including those common in Oracle environments) enables complex tests and validations to be rapidly constructed from an intuitive graphical test construction and management interface. Tests are constructed to support fully-automated, continuous regression testing—alerting you to unexpected changes while ignoring insignificant differences. Moreover, as systems evolve, automated intelligent updating helps you keep test assets in sync with changes.
If any dependencies (ERP, database, mainframes, third-party services, etc.) are not yet implemented or are not readily available in a test environment, they can easily be replaced with “virtual assets” that assume the appropriate behavior, data, and performance profiles. For dependencies that are difficult to configure for specific testing needs (e.g., due to access constraints or the specialized nature of the technology), service virtualization gives developers and testers the freedom to easily control their behavior as needed to complete a broad set of tests, including negative testing, corner cases, various performance scenarios, and so on. With a simulated test environment, team members and partners can have secure, 24/7 access to complete test environments.
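At its core, a virtual asset is a stand-in endpoint that answers like the real dependency. The sketch below shows the idea in plain Python, using a lightweight HTTP stub that returns a canned SOAP response; this is an illustration of the concept, not Parasoft's implementation, and the operation name and payload are invented for the example.

```python
# Minimal "virtual asset": an HTTP stub that stands in for a back-end
# dependency and returns a canned SOAP response. Purely illustrative;
# the operation name and payload are invented for this sketch.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSE = b"""<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetCustomerResponse>
      <Name>Test Customer</Name>
      <Status>ACTIVE</Status>
    </GetCustomerResponse>
  </soap:Body>
</soap:Envelope>"""

class VirtualAsset(BaseHTTPRequestHandler):
    def do_POST(self):
        # Drain the incoming SOAP request, then answer with the canned reply.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        self.send_response(200)
        self.send_header("Content-Type", "text/xml; charset=utf-8")
        self.send_header("Content-Length", str(len(CANNED_RESPONSE)))
        self.end_headers()
        self.wfile.write(CANNED_RESPONSE)

    def log_message(self, *args):
        pass  # suppress per-request console noise

# To serve standalone (port is arbitrary):
#   HTTPServer(("127.0.0.1", 8088), VirtualAsset).serve_forever()
```

Because the stub owns its behavior and data, testers can point the application under test at it and exercise negative cases or corner cases that the real dependency would make difficult to reproduce.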
When validating whether end-to-end transactions satisfy performance expectations, teams need to test and tune the AUT against realistic and consistent performance from dependencies. Yet, this can be particularly difficult to achieve in an Oracle ecosystem since your ability to configure dependencies may be limited by access constraints as well as system complexity. With service virtualization, the performance of each dependency is completely under your control. It’s simple to configure and adjust the performance of “virtualized” dependencies to check various what-if scenarios—providing a fast and easy way to apply the various performance profiles needed to truly exercise the AUT.
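The "performance profile" idea above can be sketched concretely: a simulated dependency exposes a tunable latency, and the same test suite is run against each what-if scenario. The class name, profiles, and numbers below are invented for illustration and are not Parasoft's API.

```python
# Sketch of configurable performance profiles for a simulated
# dependency: the stub waits a specified latency before answering,
# so the AUT can be tested and tuned against slow back ends.
# All names and numbers here are illustrative, not Parasoft's API.
import time

class SimulatedDependency:
    def __init__(self, latency_seconds=0.0):
        self.latency_seconds = latency_seconds  # tunable per scenario

    def call(self, request):
        time.sleep(self.latency_seconds)        # emulate back-end delay
        return "<Ack>OK</Ack>"

# What-if scenarios: run the same test suite against each profile.
PROFILES = {"nominal": 0.05, "degraded": 0.5, "outage-recovery": 2.0}
```

Switching profiles is then a configuration change rather than a renegotiation of access to a shared staging system.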
Case Study: Testing Oracle Fusion Middleware SOAP Services
At a recent engagement for a telecommunications client, IntegrationQA implemented a Parasoft-driven framework to validate Oracle Fusion Middleware SOAP requests/replies. The same library of pre-compiled audit checks, monitoring tools, test cases, and stress injectors applied here can be used as the foundation for running tests against any Oracle Fusion Middleware service—at any organization.
Audit and Baseline
To establish a system for validating changes and baselining the middleware before executing each test cycle, IntegrationQA built an import facility that automatically extracts the Oracle Fusion configuration into Parasoft SOAtest. Checking the baseline provides a powerful tool to monitor change within the middleware and ensure that tests provide adequate coverage when changes occur. By using Parasoft SOAtest to audit the front-side handler definition (WSDL), schema validity, semantics, and WS-I interoperability were checked automatically.
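The baselining step can be illustrated in miniature: extract the operation names from a WSDL and diff them against a stored baseline so interface changes surface before a test cycle begins. This standard-library sketch covers only operation names; a full audit (as described above) also checks schema, semantics, and WS-I interoperability.

```python
# Illustrative baseline audit: extract operation names from a WSDL and
# diff them against a stored baseline so unexpected interface changes
# surface before a test cycle. A real audit also covers schema,
# semantic, and WS-I interoperability checks.
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

def wsdl_operations(wsdl_text):
    """Return the sorted, de-duplicated operation names in a WSDL."""
    root = ET.fromstring(wsdl_text)
    return sorted({op.get("name")
                   for op in root.iter("{%s}operation" % WSDL_NS)})

def audit(baseline_ops, wsdl_text):
    """Compare the current WSDL against a recorded baseline."""
    current, baseline = set(wsdl_operations(wsdl_text)), set(baseline_ops)
    return {"added": sorted(current - baseline),
            "removed": sorted(baseline - current)}
```

A non-empty `added` or `removed` list is the cue to regenerate or extend the test suite so coverage keeps pace with the interface.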
Isolating the Middleware
To reduce the dependency on any back-end service and provide detailed transactional information, IntegrationQA configured virtual assets to simulate the back-end requests/responses. “Ring fencing” the middleware with Parasoft virtual assets significantly lowered costs by reducing the number of test systems required. Additionally, reducing the dependency on downstream systems increased productivity by lowering downtime. Testing could begin much earlier with the new solution; in fact, it could begin well before the downstream system was complete.
API Testing, Regression Testing, and Continuous Testing
A set of tests covering each operation in the WSDL was generated in minutes using Parasoft SOAtest’s automated test generation capabilities. IntegrationQA then linked a pool of test data into these tests, enabling the organization to test a wide variety of scenarios from each API test.
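The data-pool pattern works like this: one generated test acts as a template, and every row in the pool produces a distinct scenario. The template, field names, and data below are invented for the sketch; they are not the client's actual test data.

```python
# Sketch of data-driven API testing: a single request template is
# exercised against every row in a data pool, so one generated test
# yields many scenarios. Template and data are invented examples.
from string import Template

REQUEST_TEMPLATE = Template(
    "<GetCustomer><Id>$customer_id</Id><Region>$region</Region></GetCustomer>"
)

DATA_POOL = [
    {"customer_id": "1001", "region": "EU"},
    {"customer_id": "1002", "region": "US"},
    {"customer_id": "",     "region": "EU"},   # negative case: empty id
]

def build_requests(template, rows):
    """Expand the template once per data row."""
    return [template.substitute(row) for row in rows]
```

Adding a scenario then means adding a row of data, not writing a new test.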
For regression testing, Parasoft SOAtest’s automated generation of regression controls was used to record the responses from the middleware. With the results between each test cycle automatically compared, the organization could rest assured that unexpected changes would be quickly exposed.
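The "ignore insignificant differences" part of regression comparison deserves a concrete sketch: recorded responses are normalized to mask fields that legitimately vary between runs (here, a timestamp) before being diffed against the baseline. The masking rule and payloads are illustrative, not SOAtest's internal mechanism.

```python
# Sketch of a regression control: record a baseline response, then
# compare later responses against it after masking fields that vary
# legitimately between runs (here, a timestamp). Illustrative only.
import re

VOLATILE = re.compile(r"<Timestamp>[^<]*</Timestamp>")

def normalize(response):
    """Mask volatile fields so only meaningful differences remain."""
    return VOLATILE.sub("<Timestamp>*</Timestamp>", response)

def regression_diff(baseline, current):
    """Return None if responses match after masking, else both forms."""
    b, c = normalize(baseline), normalize(current)
    return None if b == c else (b, c)
```

With comparisons run automatically between cycles, a non-None result flags exactly the unexpected change the team needs to investigate.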
For continuous testing, the library of tests was configured to run automatically at regularly-scheduled intervals. This customer chose to automate testing through the Windows Task Scheduler, but test execution can be initiated from any DevOps tool or test management system that the organization prefers.
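As a hedged illustration of that scheduling approach, the configuration fragment below registers a nightly run with the Windows Task Scheduler. The task name, paths, and command-line arguments are invented for the example; consult the SOAtest documentation for the exact command-line options supported by your version.

```shell
:: Illustrative only: register a nightly test run via Windows Task
:: Scheduler. Task name, paths, and CLI arguments are examples, not a
:: verified invocation -- check your SOAtest version's CLI docs.
schtasks /Create /TN "NightlyMiddlewareTests" /SC DAILY /ST 02:00 ^
  /TR "C:\Parasoft\SOAtest\soatestcli.exe -data C:\workspace -resource MiddlewareTests -report C:\reports\nightly"
```

The same command could just as easily be triggered from a CI server or any other scheduler the organization already uses.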