How the API testing functionality within API Connect can help you across the different stages of API development.
This guide first walks you through using AutoTest Assist during development to rapidly generate requests and identify unexpected behaviour while you iterate on your API. It then shows how, as the API is promoted, you can generate repeatable test cases from the API specification.
Import the API into API Connect
Import the sample bookshop API into API Connect.
- From the API Connect SaaS homepage select Create an API.
- Under Import, select Existing API.
- Specify the bookshop sample URL to import and click Next (an optional snippet for previewing this file follows these steps):
  https://welcome.apiconnect.automation.ibm.com/labs/api-testing/bookshop.yaml
- Click Edit API.
- You can now explore the API definition. When you are ready to publish, select the overflow menu ⋮ in the top right and select Publish.
- Enter a name for the product, e.g. Bookshop, and click Next.
- Select Sandbox and click Next.
- Leave the default settings and click Publish.
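If you want to preview the bookshop specification outside API Connect, a minimal sketch like the following (assuming a Python environment with the requests and PyYAML packages installed) downloads the same file and lists its paths and operations:

```python
# Optional local preview of the bookshop OpenAPI definition used above.
# API Connect does not need this; it is only a convenience check.
import requests
import yaml

SPEC_URL = "https://welcome.apiconnect.automation.ibm.com/labs/api-testing/bookshop.yaml"

spec = yaml.safe_load(requests.get(SPEC_URL, timeout=30).text)
print(spec.get("info", {}).get("title"))
for path, operations in spec.get("paths", {}).items():
    print(path, sorted(operations.keys()))
```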
Using AutoTest Assist
In the development phase, AutoTest Assist can be used to fuzz test your APIs, quickly and easily surfacing unexpected or unwanted behaviour without needing to build out specific test cases.
- Select Test an API from the home page.
- Add a test suite using the button in the header.
- Provide it a name and save
- Select the AutoTest Profiles Tab and click on the Add+ button
- Select the imported API from the Existing APIs
- An extension file will be generated for the API; on review you will see the details of the different objects.
- Click Next, provide a name, e.g. Bookshop Autotest, and click Next.
- Select Edit and you will be shown the configuration.

- Update the ServerURL to the development endpoint for the bookstore API:
https://sample-api.us-east-a.apiconnect.dev.automation.ibm.com/bookstore
- Here you can also customise how long the test should run, the maximum number of errors before the test is stopped, and whether to prioritise particular types of operations for testing.
- Select Save to save your changes, then Run Test. The test will now run, generating requests using Biased Random data to simulate more realistic requests based on the data models in your API.
- Once the test is completed you will see a report detailing the endpoints tested and the responses received.

Here you can review the particular cases and click through to see the specific requests that were made and why they were deemed successes or failures. You can then use this to iterate on your API definition or implementation as you find unexpected behaviours.
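To make the idea of biased random request generation concrete, here is a much simplified sketch in Python of what such a fuzzing loop can look like; it is not the AutoTest Assist implementation, and the /books path and limit parameter are hypothetical examples:

```python
# Simplified illustration of biased-random fuzzing (not AutoTest Assist itself).
# Sends GET requests with values biased towards common boundary cases and
# flags unexpected server errors. The /books path and 'limit' parameter are
# hypothetical examples.
import random
import requests

SERVER_URL = "https://sample-api.us-east-a.apiconnect.dev.automation.ibm.com/bookstore"

def biased_int():
    # Bias towards boundary values that commonly expose bugs.
    return random.choice([0, 1, -1, 2**31 - 1, random.randint(2, 100)])

for _ in range(10):
    resp = requests.get(f"{SERVER_URL}/books", params={"limit": biased_int()}, timeout=10)
    if resp.status_code >= 500:
        print("Unexpected server error:", resp.status_code, resp.url)
```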
Once you have your API working as you'd like, you can move on to using Smart Generation to generate specific test cases from your specification.
Generate test cases using Smart Generation
In this stage we will demonstrate how Smart Generation can build a test case from the OpenAPI specification.
- Select Test an API from the home page.
- Select the newly created test suite tile
- Add a test using the blue +Add button and select From spec (build with smart generation)

- Select the imported Bookshop API from the list of Existing APIs
- Select one of the 200 GET paths from the list, so that we can see how Smart Generation also builds out the create and delete requests needed for the GET to succeed. Then select Next

- Once the test is generated, select Edit
- Review the generated test steps and note that a POST request and a DELETE request have been added either side of the GET you are testing, to set up and clean up its data (a plain-Python sketch of this pattern follows these steps).
- Update the global variables to point to the sample backend as follows:

      configs:
        globalVariables:
          basePath: bookstore
          domain: sample-api.us-east-a.apiconnect.dev.automation.ibm.com
          protocol: https://
- Select Run Test, then in the prompt select Save and Run
- You should then be shown the report for the test run. To view the details of the requests and the assertion results, click on default on the left.
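For reference, the sketch below approximates the generated flow in plain Python: a POST to set up the data, the GET under test, then a DELETE to clean up, with the URL built from the same global variables. The /books path and the request body fields are hypothetical; the real steps come from the bookshop OpenAPI definition.

```python
# Plain-Python approximation of the generated test flow (hypothetical paths/fields).
import requests

protocol = "https://"
domain = "sample-api.us-east-a.apiconnect.dev.automation.ibm.com"
base_path = "bookstore"
base_url = f"{protocol}{domain}/{base_path}"

# Setup step: create the resource the GET will read.
created = requests.post(
    f"{base_url}/books",
    json={"title": "Example Book", "author": "A. Writer"},
    timeout=10,
)
created.raise_for_status()
book_id = created.json().get("id")

# The GET under test.
fetched = requests.get(f"{base_url}/books/{book_id}", timeout=10)
assert fetched.status_code == 200

# Teardown step: remove the resource again.
requests.delete(f"{base_url}/books/{book_id}", timeout=10)
```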
