How to Deploy a Learning Mode Proxy for Automatic Service Virtualization

Overview

Service virtualization is essential for testing early. It uncorks a major bottleneck in the testing process and helps make shift-left and continuous testing possible. But the downside is that creating virtual services takes time.

Organizations new to the practice also need a way to demonstrate quick wins that inspire teams to adopt it. When presented correctly, teams quickly see that service virtualization is an indispensable strategy for overcoming constraints in their test environments.

This guide introduces a low-commitment, hands-off approach to service virtualization that can automatically fail over to simulated API responses based on continuous recording.

A Simple Way to Start With Service Virtualization

A core component of service virtualization that makes this possible is the message proxy, which acts as a man-in-the-middle between an application under test and a downstream service.

Message proxies are used to monitor and record traffic, as well as direct the flow of traffic between real and virtual endpoints. One of the advanced features of the message proxy in Parasoft Virtualize is Learning Mode. It continuously learns from recorded traffic and maintains a simulation of the responses it has seen, which mitigates scenarios like the real service going down and reduces the cost of pay-per-transaction dependencies.

For situations where advanced logic or fine-grained control of the virtual responses is unnecessary, Learning Mode eliminates the time commitment of creating and maintaining virtual services.

Ready to get started with service virtualization? Request a Demo »

Step-by-Step Implementation Guide

Unlike use cases where developers and testers are prototyping or simulating a service for local testing, Learning Mode is best applied in shared test environments where unstable or costly downstream dependencies affect teams’ ability to test.

The entire process of virtualizing a service with Learning Mode often takes less than an hour. With the help of an Infrastructure or DevOps engineer familiar with deployments into your test environments, it can go faster than you expect. The key steps are:

  1. Configuring a Virtualize message proxy with Learning Mode enabled.
  2. Injecting the message proxy endpoint into your test environment so it can intercept the requests your application under test makes to the downstream API you want to virtualize. This is the step where someone familiar with how your software is deployed can help.

The following diagram describes how the Virtualize message proxy fits into a system’s architecture:

Virtualize message proxy in system architecture.

Setting Up the Parasoft Virtualize Environment

Parasoft’s flexible server-based deployments are ideal for infusing service virtualization into your existing test environments, with options that include:

  • Docker images that are publicly available on DockerHub.
  • Kubernetes deployments including Helm Charts.
  • Pre-packaged images on AWS and Azure Cloud.
  • Good old-fashioned .war deployments with Apache Tomcat.

The primary components of the solution are:

Virtualize Server

This is the server where message proxies and virtual assets get deployed, to be integrated into your test environment.

Continuous Testing Platform (CTP)

This is a web-based administration and user portal for Parasoft Virtualize.

Virtualize Desktop

This is a service virtualization desktop application with a user-friendly UI and AI Assistant chatbot for power users working on virtual services with more sophisticated requirements on response behavior.

This guide assumes you have already deployed and licensed Parasoft CTP and Virtualize server. It also assumes you have identified an application under test with a downstream API dependency that you would like to virtualize.

Note: Networking is a key prerequisite for injecting endpoints hosted by Parasoft Virtualize into your test environment. Before proceeding, check that the appropriate ports to your Virtualize server are open (default: 9080/9443) and that your Virtualize server can make outbound connections to other services in your test environment. Furthermore, some test environments enforce HTTPS, and the self-signed certificate Virtualize ships with may be insufficient; in these cases, you may need an SSL certificate generated for the Virtualize server.
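A quick way to sanity-check these networking prerequisites is a small script that attempts TCP connections to the relevant ports. This is a generic sketch, not a Parasoft tool, and the hostname is a placeholder for your own Virtualize server:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the default Virtualize ports (replace the host with your server's).
for port in (9080, 9443):
    reachable = port_is_open("virtualize.example.internal", port)
    print(f"port {port}: {'open' if reachable else 'blocked or closed'}")
```

Run the same check in the other direction, from the Virtualize server to the downstream services, to confirm outbound connectivity.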

Deploying the Learning Mode Proxy

The first step in the process is creating a message proxy on your Virtualize server. This can be done with Parasoft CTP using a web browser.

After logging in to CTP, you will see the Environment Manager Workspace page.

Screenshot of the Environment Manager Workspace page after logging into Parasoft CTP.

Click the Environment Manager menu on the top left and select Service Virtualization.

Screenshot showing how to open the Environment Manager in the menu.

You will be redirected to a page that provides a thin-client interface to your Virtualize server.

Screenshot showing a page that provides a thin-client interface to your Virtualize server.

Right-click the Virtualize server you have connected to CTP and select Create Message Proxy.

Screenshot showing the menu when you right click on the Virtualize server.

Give the message proxy a name and click Save.

Screenshot showing the text box to name your message proxy.

You will see your new Message Proxy added as a node under your Virtualize server. Right-click the node and select Add HTTP Connection.

Screenshot showing how the new Message Proxy is added as a node under your Virtualize server.

This is where you will configure the message proxy, whose listener endpoint will be injected in-between your application under test and service to be virtualized.

Screenshot showing where to configure the message proxy

Define a Proxy listen path for your message proxy. This completes the message proxy’s endpoint that you will use to replace the endpoint of the service to be virtualized in the deployment configuration of your application under test.

Screenshot of inputs for defining a proxy listen path.

Next, enable the Use fallback connection checkbox and fill out the form fields for host, port, and path. Use host.virt.internal as shorthand for the Virtualize server host. This configures a secondary connection where the virtual asset will be automatically deployed. When the real service becomes unavailable on the primary connection, the message proxy will fail over to the secondary connection that points to the virtual service deployed on the Virtualize server.
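Conceptually, the fail-over behavior works like this (an illustrative sketch, not Parasoft's implementation; the two services are stand-ins for the primary and fallback connections):

```python
def proxy_request(request, primary, fallback):
    """Forward a request to the primary connection; on failure, fail over
    to the fallback connection (the auto-deployed virtual asset)."""
    try:
        return primary(request)    # real service on the primary connection
    except ConnectionError:
        return fallback(request)   # learned virtual responses fill in

# Illustrative stand-ins for the two connections:
def real_service(req):
    raise ConnectionError("real service is down")

def virtual_service(req):
    return {"status": 200, "body": f"recorded response for {req}"}

print(proxy_request("GET /accounts", real_service, virtual_service))
```

The application under test never sees the switch; it keeps sending requests to the same proxy endpoint either way.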

Screenshot of forms you fill out for Use fallback connection and host, port, and path.

Click Save, and then right-click the message proxy node and click Enable.

Screenshot example of proxy node form fill.

The message proxy on your Virtualize server is now active and ready to be integrated into your test environment.

Integrating the Learning Mode Proxy with your Application Under Test

At this point, an Infrastructure or DevOps engineer will be needed to help you re-configure the deployment of your application under test so that it points to your message proxy endpoint instead of directly to the service you want to virtualize.

The example in this guide is based on the Parabank demo application, which has a convenient admin web page for dynamically switching the downstream API the application depends on.

Keep in mind that the following steps are specific to the Parabank demo application; most application deployments instead have a properties file or secret that defines external connection endpoints. This is where a one-time change needs to be made so that, upon re-deployment, the application will talk to your Virtualize server’s message proxy as a passthrough to the downstream service.
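For applications that read their downstream endpoint from a properties file, the one-time change is typically a single key. Here is a hypothetical sketch; the property name `service.endpoint`, the file name, and the hosts are placeholders for your own configuration:

```python
from pathlib import Path

def point_at_proxy(props_path: str, key: str, proxy_url: str) -> None:
    """Rewrite one endpoint property so the app talks to the message proxy."""
    path = Path(props_path)
    lines = []
    for line in path.read_text().splitlines():
        if line.startswith(f"{key}="):
            lines.append(f"{key}={proxy_url}")  # swap in the proxy endpoint
        else:
            lines.append(line)                  # leave everything else intact
    path.write_text("\n".join(lines) + "\n")

# Example: swap the real service URL for the proxy's listener endpoint.
# point_at_proxy("app.properties", "service.endpoint",
#                "http://virtualize.example.internal:9080/myproxy")
```

After the change, re-deploy the application so it picks up the new endpoint.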

Select the message proxy you created under the Virtualize server and copy the Proxy Connection Settings.

Screenshot of the message proxy connection settings.

In the case of Parabank, the Admin Page lets you conveniently reconfigure the REST Endpoint it depends on without needing to change a configuration file and redeploy the application.

Screenshot of Parabank showing how it lets you conveniently reconfigure the REST Endpoint.

Monitoring the Learning Process

Before letting the IT or DevOps engineer go, you want to confirm that the changes made to the deployment of your application have not resulted in any noticeable change of behavior. It should now be directing its downstream API traffic to your message proxy, and the proxy should be forwarding those requests along to the real service your application depends on. Monitoring the traffic of the message proxy is a convenient way to make sure everything is working as expected.

From the Service Virtualization page in CTP, navigate to Events.

Screenshot of the Service Virtualization page in Parasoft CTP

Find your message proxy in the list of deployments on the Virtualize server, and make sure both the checkbox and monitoring icon are enabled.

Screenshot showing how to find your message proxy in the list of deployments on the Virtualize server.

You may see some Event Messages already, but for the time being click Clear to start from a clean slate before you begin testing the messaging flow between your application under test, the message proxy, and the downstream service.

Screenshot showing to clear event messages before you begin testing.

At this point, go ahead and exercise your application under test and then return to the Events page in CTP to view the monitored traffic. In the case of Parabank, we will log in.

Screenshot of login and home page

Coming back to the CTP Events page, we see a notification that new event messages are available.

Screenshot showing example CTP Events page notifications.

There will typically be a pattern of 4-5 logged messages relating to a request-response interaction.

Closeup screenshot of Proxy messages.

Request Received

The proxy receives an incoming request.

Proxy Request Sent

The proxy forwards the request to its destination endpoint.

Info 

In the case of learning mode, an event gets logged when the proxy records the traffic to disk.

Screenshot of events logged in learning mode.

A screenshot of the proxy receiving a response from the destination endpoint.

Proxy Response Received 

The proxy receives a response from the destination endpoint.

Response Sent

The proxy forwards the response back to the client application.
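The pass-through flow behind those events can be sketched as a simple pipeline. This is illustrative only; the event names mirror the log above, but the forwarding and recording logic are stand-ins for what the proxy actually does:

```python
def handle_through_proxy(request, destination, record, log):
    """Illustrate the event sequence a learning-mode proxy logs per interaction."""
    log("Request Received")         # client request arrives at the proxy
    log("Proxy Request Sent")       # request forwarded to the destination
    record(request)                 # learning mode persists the traffic to disk
    log("Info: traffic recorded")
    response = destination(request)
    log("Proxy Response Received")  # response arrives from the destination
    log("Response Sent")            # response forwarded back to the client
    return response

events = []
handle_through_proxy("GET /ping", lambda r: "pong", lambda r: None, events.append)
print(events)
```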

Learning Mode Use Cases

There are two use cases for the learning mode message proxy:

  1. The primary service connection is set to the real service, while the fail-over connection is set to the virtual service.
  2. The primary service connection is set to the virtual service, while the fail-over connection is set to the real service.

When the primary connection is set to the real service, your application primarily depends on the real service while testing. This mode is the closest to how you were testing before service virtualization, except that now, when there is instability with that downstream service, you are no longer blocked from testing because a virtual service fills in until the real service becomes available again.

When the primary connection is set to the virtual service, your application primarily depends on the virtual service while testing. When transactions with that service have a cost, this mode can be very advantageous in reducing how much money is being spent to support testing.

Troubleshooting and Optimization

Proxy Connectivity Issues

The most common challenge is connectivity. These issues usually fall into three categories:

  1. Endpoint misconfiguration
  2. Port is not open
  3. SSL/TLS issue

Monitoring the event messages from your message proxy can be very helpful in troubleshooting connectivity issues.

For example:

  • If you do not see any requests coming from your application under test, then you should:
    • Check to make sure you can make a client request from the application server to the message proxy on your Virtualize server with a tool like cURL. If this fails, there may be a network issue blocking your application server from making requests to the Virtualize server. The default port for Virtualize is 9080, but the message proxy also supports custom HTTP listeners, so ensure the correct ports are open on your Virtualize server.
    • Double check to make sure the Virtualize server’s message proxy endpoint is correctly configured on your application under test’s deployment settings.
  • If you see requests from your application under test, but you do not see any responses from the downstream service, then you should:
    • Check to make sure you can make a client request from your Virtualize server to the downstream service with tools like cURL, a Parasoft Virtualize Provisioning Action, or Parasoft SOAtest REST Client. If this fails, there may be a network issue blocking your Virtualize server from making requests to the downstream service.
    • Double check to make sure the message proxy’s service connection settings are correctly configured.
  • If you see any SSL-related events, it could be that the downstream service expects your message proxy to use a secure connection. The Virtualize server comes with a self-signed certificate. If this is insufficient, then you will need to have a valid certificate generated into a keystore for your Virtualize server so that the message proxy can be configured to make an SSL handshake to your downstream service with the appropriate certificate.

Learning Mode Quality Issues

Service virtualization setup with the learning mode message proxy is designed for simple correlation between requests and responses. If you find that the virtual service is failing over to the real service more than you think it should, or is returning stale data, some customization is available. When you download the virtual asset generated by learning mode into your Virtualize Desktop workspace, you can configure exclude patterns for request message correlation:

Screenshot of the learning mode data retention options.

There may be request parameters that are irrelevant to the recorded response that you would like to exclude from correlation, generalizing the cases when an incoming request correlates to a recorded response.
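The effect of excluding parameters from correlation can be pictured like this (a sketch of the idea, not Virtualize's matching engine; the parameter names are hypothetical):

```python
def correlation_key(request_params: dict, excluded: set) -> tuple:
    """Build a matching key from a request, ignoring excluded parameters."""
    return tuple(sorted((k, v) for k, v in request_params.items()
                        if k not in excluded))

recorded = {}
# Record a response, excluding a timestamp parameter that changes on every call.
excluded = {"timestamp"}
req1 = {"accountId": "123", "timestamp": "2024-01-01T10:00:00Z"}
recorded[correlation_key(req1, excluded)] = {"balance": 515.50}

# A later request with a different timestamp still correlates to the response.
req2 = {"accountId": "123", "timestamp": "2024-06-01T09:30:00Z"}
print(recorded.get(correlation_key(req2, excluded)))
```

Without the exclusion, the changing timestamp would make every incoming request look new, forcing a fail-over to the real service.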

Or you may have requirements for the virtual service that are more sophisticated, for example:

  • More granular control over response data based on certain logic that correlates to incoming request parameters.
  • The desire for test data management tied to virtual services, with features like synthetic data generation, resetting, or subsetting.
  • CRUD use cases where you want responses to GET requests to be stateful based on handling POST/PUT/DELETE requests.
  • More granular control over response time performance characteristics.
  • Polling-style use cases where you want virtual responses for an identical request to follow a sequence pattern.

These are good examples of needs that exceed what the learning mode virtual service can provide. Learning mode is ideally suited for very fast setup, where very little time is invested in creating and maintaining virtual services, and it can provide a lot of value very quickly when the use case is fairly static. Advanced use cases like the ones above are straightforward to implement, but they require the virtual assets to be built in Virtualize Desktop and then deployed on the Virtualize server. Parasoft Virtualize allows for a lot of flexibility: you can even create assets that support these advanced use cases and then use the learning mode responses as a catch-all before finally failing over to the real service.
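That layered arrangement can be sketched as a chain of handlers (illustrative only, not Virtualize's routing logic): a hand-built virtual asset first, the learning-mode responses as a catch-all, and the real service last.

```python
def chain(request, handlers):
    """Try each handler in order; a handler returning None passes the request
    along, mimicking the custom-asset -> learning-mode -> real-service chain."""
    for handler in handlers:
        response = handler(request)
        if response is not None:
            return response
    raise RuntimeError("no handler produced a response")

# Hypothetical handlers: each answers only the requests it knows about.
custom_asset = lambda r: {"source": "custom"} if r == "/special" else None
learned      = lambda r: {"source": "learned"} if r == "/accounts" else None
real_service = lambda r: {"source": "real"}

print(chain("/special", [custom_asset, learned, real_service]))
print(chain("/other", [custom_asset, learned, real_service]))
```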