savvytest User's Guide and Reference STUG0300-EN

Quick Start

Basic Background Knowledge

In order to use savvytest efficiently, we recommend that you become familiar with the basic principles and some elementary terms used in savvytest.

Interface

Using savvytest, you always test an interface. The technical description of an interface consists of the called program and the data structures passed to it (also called the signature). In COBOL, for example, this may look as follows:


CALL 'PROGRAM1' USING PARM-1 PARM-2


In savvytest, the called program whose interface we want to test is called the target.


The data structures passed in the above USING clause are usually copybooks. In order to use them in savvytest, they have to be imported. During the import, they are translated into a language-independent XML format, simply called a structure in savvytest. This is done because with savvytest you can create and perform tests irrespective of the language the target is written in.
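
For illustration, a copybook for PARM-1 from the CALL example above might look like the following sketch (purely hypothetical; all field names are made up):

       01  PARM-1.
           05  CONTRACT-ID        PIC X(10).
           05  REQUEST-TYPE       PIC X(02).
           05  REQUEST-AMOUNT     PIC S9(7)V99 COMP-3.

A copybook like this is what the importer translates into a savvytest structure.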


When you have technically described your interface by the target and the structures it uses, the information is still missing as to which of the structures are used for input and which for output. This usage is neither contained in the structures nor implied by the call statement; it is merely a convention between the calling and the called program. In order to simplify the later use of the interface, savvytest provides a way to add this semantic information to the interface definition.


In a nutshell, the savvytest test interface is a technical description (call interface) plus semantic information (usage). This provides a business view of the interface because it allows masking out the technical details that are not relevant for the business functionality.




In summary, we need

  - the target, i.e. the program to be called,
  - the structures that are passed to the target as call parameters, and
  - the usage of these structures and their attributes (the semantic information).


With this information, savvytest knows how to (technically) call the target and how the data are (semantically) used by the target.

Test Scenario

A test scenario basically describes what to test, which input to use, and which output to expect in order to consider the test successful. In terms of savvytest, we need to provide the following information in order to create a test case for a test scenario:

  - the test interface to be called,
  - the input data to be passed to the target, and
  - the check conditions that define the expected results.

Please note that check conditions are essential. Not only are they required to document the test scenario properly, they are also crucial for the re-use of the test scenario. Without check conditions, you would be unable to automate the tests. Even worse, you would never be able to re-use the scenario for regression testing.


A test scenario is intended to be a reproducible test of one or more specific requirements implemented by an interface.

Test Case

We speak of a test case each time an interface is called during the test run. The simplest variant of a test scenario contains just one test case and is thus referred to as a simple test scenario.


However, it is often necessary to create test scenarios that consist of more than one test case. There may be correlations that need to be reproduced in the test scenario. As an example, you may want to process contract data, but the function requires a contract ID that you must first read using another function. Or you may want to store certain data in a database in order to provide a reproducible data environment for your actual test interface and become independent of the environment.


In order to accomplish such tasks, savvytest allows you to combine multiple test cases in one test scenario, and to pass data from one test case to another by using references.




As a rule, a test scenario consists of all test cases required to check one or more requirements of one specific interface. The test cases belong to this test scenario in a specific order.


Please note that a test scenario should not test multiple interfaces; it should use the additional test cases only for so-called setup and teardown purposes to support the test of one interface.

Test Suite

In contrast to the close relationship between the test cases within a test scenario, a test suite is just a loose bundle of test scenarios. The test scenarios in a test suite are not related to each other and may even be executed in any order without interfering with each other.


The test scenarios are bundled in a test suite mostly for practical reasons. For example, you may have a collection of test scenarios that you want to run regularly as a regression test.

Test Run

When you run a test scenario or a test suite, all required information and data are collected from the test scenario and the test interfaces it uses. Next, these data are converted into a format required by the target platform and copied over to it, and then the tests are executed on the target platform. After that, the result data are converted back and parsed, and the test results are displayed in a result report.



Creating And Preparing A Test Project

Select Perspective

While using savvytest, you should select the savvytest perspective, which provides an optimal collection of views. From the menu, select Window > Open Perspective > Other..., then select savvytest from the list and click OK. Once you have opened the savvytest perspective, you can quickly switch to other perspectives and back to savvytest by using the buttons in the upper right area of the toolbar.

Create New savvytest Project

In order to start with a brand-new project, select from the menu File > New > savvytest Project. Alternatively, just click on the savvytest symbol with the plus sign available in the toolbar. The New savvytest Project wizard will appear. Simply enter a project name and click Finish.


If you already have a savvytest project that you want to import into your workspace, you can simply use the standard project import mechanism. From the menu, select File > Import, then in the list select General > Existing Projects into Workspace. Once imported, savvytest will recognize the project as a savvytest project.

Create JCL Template

In order to run tests, savvytest requires at least one JCL template in the jcl folder of the project to be used for execution on the target platform. A sample JCL can be found in the savvytest Installation and Customization Guide[1]. It usually requires some customization to meet the requirements of your specific test environment.
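
A JCL template is ordinary JCL from which savvytest builds the actual test job (see Test Run above). The skeleton below is only a rough, hypothetical illustration of what such a batch job generally looks like; the program name, library, and DD statements are placeholders, and the real template with the correct names for your installation is the one documented in the Installation and Customization Guide:

//SAVVYRUN JOB (ACCT),'SAVVYTEST RUN',CLASS=A,MSGCLASS=X
//*------------------------------------------------------------------
//* HYPOTHETICAL SKELETON ONLY - THE ACTUAL TEMPLATE SHIPPED WITH
//* SAVVYTEST DEFINES THE REAL STEPS, PROGRAM AND LIBRARY NAMES.
//*------------------------------------------------------------------
//TESTRUN  EXEC PGM=TESTDRVR            PLACEHOLDER PROGRAM NAME
//STEPLIB  DD DISP=SHR,DSN=YOUR.TEST.LOADLIB
//SYSPRINT DD SYSOUT=*
//SYSOUT   DD SYSOUT=*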

Adjust Project Properties

Right-click on your project's name in the Project Explorer to the left, then, from the context menu, select Properties. From the list shown, select savvytest. Usually, you do not need to make any modifications in the general savvytest entry. Please see the Reference part for a description of the options there.


If you expand the savvytest entry on the left, you will see two more entries: COBOL Copybook Importer and Target Environments.

COBOL Copybook Importer

You do not need to make any modifications in the COBOL Copybook Importer section unless your copybooks contain special characters that need to be replaced or removed. If so, click Add to add one or more string replacements. In order to remove strings, just leave the replacement entry empty.

Target Environments

Before you can run a test, you need to configure the Target Environment to run the test in. After you have selected this entry, you will see one or more buttons on the right labeled Add plus a connection type suitable for your environment. Click the appropriate button, then enter a Name and a Description for the target environment. Next, select the remote system or server connection.


You have to enter connection data only if you use the FTP connector. Otherwise, the connection data are taken from your development environment.


Finally, check the MVS dataset name patterns; it is usually not necessary to change them.

You are now ready to go.

Four Steps For Testing

There are four basic steps which you need to know in order to create and run tests with savvytest. This chapter provides an overview of them to allow you to get started quickly. Please see the Reference part of this document whenever you want to know more details.


At this point, it is presumed that you have already created and configured a savvytest project as described in the previous chapter. Additionally, we need access to the copybooks that we are going to use, either on a local filesystem or on a remote system.


The four basic steps for creating and running a test are:

  1. Import Structures
  2. Define Interface
  3. Create Test Scenario
  4. Execute Test

Step 1: Import Structures

The structures used need to be converted to savvytest's language-independent format. For COBOL copybooks, this is done by the COBOL Copybook Importer.

  1. Right-click in your savvytest Project, then from the context menu, select Import
    (alternatively, from the menu, select File > Import)
    
    → The Import Wizard opens

  2. Select savvytest / COBOL Copybook Importer, then click Next

  3. Click Select...
    → The file selection window opens

  4. Select one or more required copybook(s) and click Open
    (if necessary, select an alternate character encoding)

  5. If necessary, tick Overwrite existing structures and click Finish


The selected structures are now imported and can be used in savvytest interfaces. The structures usually need to be imported only once unless their sources change.

Step 2: Define Interface

The basic definition of an interface is the name of the program to be called (referred to as the test target) and the data structures passed to it as call arguments. This corresponds to the technical interface as specified in a CALL statement (program name and USING parameters).

The technical interface is completed by adding the type of usage of the structures and/or the attributes within them. The usage is usually a convention between the calling and the called program as to which structures or attributes are used as input data and which as output data. Other types of usage are: fixed technical data, or no usage at all (for details, see the table below).

We recommend that you perform this step thoroughly, because it makes creating test scenarios more efficient.
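
To make this more concrete, here is how the usage could be assigned for the hypothetical PROGRAM1 example from the Interface section (the inline comments merely illustrate the idea; the actual assignment is made per structure or attribute in the Interface Editor, not in the copybook source):

       01  PARM-1.                  *> usage "in"  (filled by the caller)
           05  CONTRACT-ID        PIC X(10).
           05  REQUEST-TYPE       PIC X(02).
           05  REQUEST-AMOUNT     PIC S9(7)V99 COMP-3.
       01  PARM-2.                  *> usage "out" (filled by PROGRAM1)
           05  RETURN-STATUS      PIC X(01).
           05  ERROR-MESSAGE      PIC X(80).

With such an assignment, the fields of PARM-1 would later appear under Input Data in the test scenario editor, and the fields of PARM-2 under Simple Check Conditions (see the table below).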


  1. Right-click in your savvytest project, then from the context menu, select
    New > savvytest Test Interface
    (alternatively, from the menu, select File > New > savvytest Test Interface)
    → The Interface Wizard opens

  2. Enter Name, Alias name, and Description, then click Next

  3. In Target name, enter the name of the program to be tested,
    then add the required call parameters for this program,
    then click Finish
    → The Interface Editor opens with the new interface just created

  4. Select usage for structures and/or attributes

Usage     Meaning                                        Visibility in test scenario editor
-------------------------------------------------------------------------------------------
in        Used as input data                             In Input Data
out       Used as output data                            In Simple Check Conditions
in/out    Used as input and output data                  Both in Input Data and Simple Check Conditions
fix       Fixed (technical) data, contents set in the    No
          interface only
none      Not used and never filled with any data*       No

* Please see the Reference part for details on what this is used for.

Step 3: Create Test Scenario

In a test scenario, you select the interface to be called in the test, the input data to be passed, and the check conditions required to decide whether the test output conforms to the expected results. Check conditions are often also referred to as assertions. For example, a check condition might require that a returned status field contain a specific expected value.


  1. Right-click on the interface for which you want to create a test scenario,
    then in the context menu, select
    New > savvytest Test Scenario
    (alternatively, from the menu, select File > New > savvytest Test Scenario)
    → The Test Scenario Wizard opens

  2. Enter Name, Alias name and Description, then click Next

  3. Select interface (if not selected already), then click Finish
    → The Test Scenario Editor opens with the new test scenario just created

  4. Enter any input data required for the test

  5. Select the Simple Check Conditions tab and add the check conditions to verify the expected results

  6. Save the test scenario


If a test scenario requires more than one test case, we call it a complex test scenario. You can add additional test cases by clicking on the Add test case icon above the list of test cases on the left of the editor. This opens the Add Test Case Wizard, which allows you to add test cases in the same way as described above. Please see the Reference part for more details on how to create complex test scenarios.

Step 4: Execute Test

For test execution, all required specifications are collected and transferred to the target platform. Then a job is created based on the selected JCL template and submitted to the target platform. After the execution of the job has finished, the result data are transferred back and analyzed, and finally a result report is created and displayed.

  1. Open the test scenario, then click on the greenish Execute test scenario button located in the upper right corner of the editor
    (alternatively, right-click on the test scenario, then select
    Test > Run test scenario from the context menu)
    → The Run Test dialog opens

  2. Optionally, change the pre-selected Target Environment, JCL Template, and test options,
    then click OK
    → The test will be run; after it has finished, the Result Report will be displayed



You can close the Run Test dialog by clicking on Run in Background. If you do, the dialog will be closed, but the test continues to run in the background. Its status can be seen in the Progress view, which is usually found in the lower middle area. Once the test execution has finished, you can open the result report from this view by clicking on the link provided there.


The Result Report offers two different views:


How To Reflect Interface Changes

During ongoing development, an interface may change, either due to changes in the structures it uses or due to changes in the type or number of argument structures passed. Of course, this affects any test scenarios that have been created for this interface.


However, due to the intelligent automatic migration mechanism built into savvytest, there are just two steps required to reflect the interface changes in the associated test scenarios:

  1. Re-import changed structures
    A new import is required only if changes were made to a structure; it makes the changes known to savvytest.
    To do so, simply repeat step 1 Import Structures as explained above and tick Overwrite existing structures.

  2. Update test interface
    Open the affected test interface; opening it automatically triggers the intelligent migration. Changes in structures will be recognized and migrated.
    If new structures or attributes were added, their usage and default values may then be changed, if necessary. To do so, please refer to step 2 Define Interface.
    Finally, save the updated interface.


After that, all test scenarios based on this interface will use the new interface when executed, migrating the test data accordingly.


Naturally, if the interface changes require modifications in the test scenarios because of additional input data or new check conditions, these modifications have to be made manually.

