API Testing in OpenShift Pipelines with Newman

If you are writing REST-based API applications, you are probably familiar with Postman, a tool that lets you test your APIs via an interactive GUI. However, did you know that there is a CLI equivalent of Postman called Newman that works with your existing Postman collections? Newman enables you to re-use those collections to integrate API testing into automated processes where a GUI would not be appropriate. While we will not go into the details of Postman or Newman here, if you are new to the tools you can check out this blog, which provides a good overview of both.
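
For example, assuming you have exported a collection and an environment from the Postman GUI (the file names below are placeholders), a basic Newman run from the command line looks like this:

# Run the collection against the given environment, stopping at the first failure
newman run my-collection.json -e my-environment.json --bail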

Integrating Newman into OpenShift Pipelines, which is based on Tekton, is very straightforward. In this blog we are going to look at how I am using it in my product catalog demo to test the back-end API, built in Quarkus, as part of the CI/CD process powered by OpenShift Pipelines. This CI/CD process is shown in the diagram below; note the two tasks where we do our API testing in the Development and Test environments, dev-test and test-test (an unfortunate name) respectively. These tests run after the new image is built and deployed in each environment, and are thus considered integration tests rather than unit tests.

Product Catalog Server CI/CD

One of the things I love about Tekton, and thus OpenShift Pipelines, is its extensibility: it is very easy to extend by creating custom tasks that use either an existing image or an image you have built yourself. If you are not familiar with OpenShift Pipelines or Tekton, I would highly recommend checking out the concepts documentation, which provides a good overview.

The first step to using Newman in OpenShift Pipelines is to create a custom task for it. Tasks in Tekton represent a sequence of steps to accomplish a specific goal or, as the name implies, task. Each step uses a specified container image to perform its function. Fortunately, in our case there is an existing container image for Newman at docker.io/postman/newman that we can leverage without having to create our own. Our task definition for the newman task appears below:

apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: newman
spec:
  params:
  - name: COLLECTION
    description: The collection to run, typically a remote URL pointing to the collection
    type: string
  - name: ENVIRONMENT
    description: The environment file to use from the newman-env configmap
    type: string
    default: "newman-env.json"
  steps:
    - name: collections-test
      image: docker.io/postman/newman:latest
      command:
        - newman
      args:
        - run
        - $(params.COLLECTION)
        - -e
        - /config/$(params.ENVIRONMENT)
        - --bail
      volumeMounts:
        - name: newman-env
          mountPath: /config
  volumes:
    - name: newman-env
      configMap:
        name: newman-env
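
Before wiring the task into a pipeline, you can exercise it on its own with a TaskRun. This is a minimal sketch; the TaskRun name is my own choice and the parameter values match the ones used later in the pipeline:

# Runs the newman task once against the dev environment
apiVersion: tekton.dev/v1beta1
kind: TaskRun
metadata:
  name: newman-dev-test
spec:
  taskRef:
    name: newman
  params:
    - name: COLLECTION
      value: https://raw.githubusercontent.com/gnunn-gitops/product-catalog-server/master/tests/product-catalog-server-tests.json
    - name: ENVIRONMENT
      value: newman-dev-env.json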

There are two parameters declared as part of this task, COLLECTION and ENVIRONMENT. The COLLECTION parameter references a URL to the test suite that you want to run; the suite is typically created using the Postman GUI and exported as a JSON file. For the pipeline in the product catalog we use product-catalog-server-tests.json. Each test in the collection represents a request/response against the API along with some simple tests to ensure conformance with the desired results.

For example, when requesting the list of products, we test that the response code is 200 and that 12 products are returned, as per the picture below:

Postman
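
In the Postman GUI these checks live in the Tests tab of the request, but under the hood they are just small JavaScript snippets stored with the request. A sketch of what the two assertions above might look like (the exact names and structure in the real collection may differ):

// Assert the request succeeded with HTTP 200
pm.test("response is ok", function () {
    pm.response.to.have.status(200);
});

// Assert that all 12 products came back
pm.test("data valid", function () {
    var products = pm.response.json();
    pm.expect(products.length).to.eql(12);
});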

The ENVIRONMENT parameter selects an environment file, stored in a configmap, with the customizations the test suite requires for the specific environment being tested. For example, the APIs for the development and test environments have different URLs, so we need to parameterize the host in order to re-use the same test suite across all environments. You can see the environments for dev and test in my github repo. The task is designed so that a single configmap, newman-env, contains all of the environments as separate files, as per the example here:

apiVersion: v1
data:
  newman-dev-env.json: "{\n\t\"id\": \"30c331d4-e961-4606-aecb-5a60e8e15213\",\n\t\"name\": \"product-catalog-dev-service\",\n\t\"values\": [\n\t\t{\n\t\t\t\"key\": \"host\",\n\t\t\t\"value\": \"server.product-catalog-dev:8080\",\n\t\t\t\"enabled\": true\n\t\t},\n\t\t{\n\t\t\t\"key\": \"scheme\",\n\t\t\t\"value\": \"http\",\n\t\t\t\"enabled\": true\n\t\t}\n\t],\n\t\"_postman_variable_scope\": \"environment\"\n}"
  newman-test-env.json: "{\n\t\"id\": \"30c331d4-e961-4606-aecb-5a60e8e15213\",\n\t\"name\": \"product-catalog-dev-service\",\n\t\"values\": [\n\t\t{\n\t\t\t\"key\": \"host\",\n\t\t\t\"value\": \"server.product-catalog-test:8080\",\n\t\t\t\"enabled\": true\n\t\t},\n\t\t{\n\t\t\t\"key\": \"scheme\",\n\t\t\t\"value\": \"http\",\n\t\t\t\"enabled\": true\n\t\t}\n\t],\n\t\"_postman_variable_scope\": \"environment\"\n}"
kind: ConfigMap
metadata:
  name: newman-env
  namespace: product-catalog-cicd

In the raw configmap the environments are hard to read due to the escaped formatting; below is what newman-dev-env.json looks like when formatted properly. Notice the host points to the service in the product-catalog-dev namespace.

{
	"id": "30c331d4-e961-4606-aecb-5a60e8e15213",
	"name": "product-catalog-dev-service",
	"values": [
		{
			"key": "host",
			"value": "server.product-catalog-dev:8080",
			"enabled": true
		},
		{
			"key": "scheme",
			"value": "http",
			"enabled": true
		}
	],
	"_postman_variable_scope": "environment"
}
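
The configmap does not need to be maintained by hand; one way to build it is directly from the environment files. A sketch assuming both JSON files are in your current directory:

# Bundle all environment files into the single newman-env configmap
oc create configmap newman-env -n product-catalog-cicd \
  --from-file=newman-dev-env.json \
  --from-file=newman-test-env.json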

So now that we have our task, our test suite and our environments, we need to add the task to the pipeline for each environment we want to test. You can see the complete pipeline here; an excerpt showing the pipeline testing the dev environment appears below:

    - name: dev-test
      taskRef:
        name: newman
      runAfter:
        - deploy-dev
      params:
        - name: COLLECTION
          value: https://raw.githubusercontent.com/gnunn-gitops/product-catalog-server/master/tests/product-catalog-server-tests.json
        - name: ENVIRONMENT
          value: newman-dev-env.json
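
With the task wired in, the tests run on every execution of the pipeline. You can also start the pipeline manually with the tkn CLI; a sketch assuming the pipeline is named product-catalog-server (supply any parameters your pipeline requires with -p):

# Start the pipeline and stream the logs to the console
tkn pipeline start product-catalog-server --showlog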

When the task runs, Newman logs the results of the tests, and if any test fails it returns an error code that propagates up to the pipeline and causes the pipeline itself to fail. Here is the result from testing the Dev environment:

newman
Quarkus Product Catalog
→ Get Products
GET http://server.product-catalog-dev:8080/api/product [200 OK, 3.63KB, 442ms]
✓ response is ok
✓ data valid
→ Get Existing Product
GET http://server.product-catalog-dev:8080/api/product/1 [200 OK, 388B, 14ms]
✓ response is ok
✓ Data is correct
→ Get Missing Product
GET http://server.product-catalog-dev:8080/api/product/99 [404 Not Found, 115B, 18ms]
✓ response is missing
→ Login
POST http://server.product-catalog-dev:8080/api/auth [200 OK, 165B, 145ms]
→ Get Missing User
GET http://server.product-catalog-dev:8080/api/user/8 [404 Not Found, 111B, 12ms]
✓ Is status code 404
→ Get Existing User
GET http://server.product-catalog-dev:8080/api/user/1 [200 OK, 238B, 20ms]
✓ response is ok
✓ Data is correct
→ Get Categories
GET http://server.product-catalog-dev:8080/api/category [200 OK, 458B, 16ms]
✓ response is ok
✓ data valid
→ Get Existing Category
GET http://server.product-catalog-dev:8080/api/category/1 [200 OK, 192B, 9ms]
✓ response is ok
✓ Data is correct
→ Get Missing Category
GET http://server.product-catalog-dev:8080/api/category/99 [404 Not Found, 116B, 9ms]
✓ response is missing
┌─────────────────────────┬───────────────────┬───────────────────┐
│                         │          executed │            failed │
├─────────────────────────┼───────────────────┼───────────────────┤
│              iterations │                 1 │                 0 │
├─────────────────────────┼───────────────────┼───────────────────┤
│                requests │                 9 │                 0 │
├─────────────────────────┼───────────────────┼───────────────────┤
│            test-scripts │                 8 │                 0 │
├─────────────────────────┼───────────────────┼───────────────────┤
│      prerequest-scripts │                 0 │                 0 │
├─────────────────────────┼───────────────────┼───────────────────┤
│              assertions │                13 │                 0 │
├─────────────────────────┴───────────────────┴───────────────────┤
│ total run duration: 883ms                                       │
├─────────────────────────────────────────────────────────────────┤
│ total data received: 4.72KB (approx)                            │
├─────────────────────────────────────────────────────────────────┤
│ average response time: 76ms [min: 9ms, max: 442ms, s.d.: 135ms] │
└─────────────────────────────────────────────────────────────────┘

So to summarize, integrating API testing with OpenShift Pipelines is quick and easy. While in this example we showed the process using Newman, other API testing tools can be integrated following a similar process.
