Chapter 76. Test scenarios (legacy) designer in Business Central

Red Hat Decision Manager currently supports both the new Test Scenarios designer and the former Test Scenarios (Legacy) designer. The default designer is the new test scenarios designer, which supports testing of both rules and DMN models and provides an enhanced overall user experience with test scenarios. If required, you can continue to use the legacy test scenarios designer, which supports rule-based test scenarios only.

76.1. Creating and running a test scenario (legacy)

You can create test scenarios in Business Central to test the functionality of business rule data before deployment. A basic test scenario must have at least the following data:

  • Related data objects
  • GIVEN facts
  • EXPECT results
Note

The legacy test scenarios designer supports the LocalDate Java built-in data type, which you can use in the dd-mmm-yyyy date format, for example, 17-Oct-2020.

With this data, the test scenario can validate the expected and actual results for that rule instance based on the defined facts. You can also add a CALL METHOD and any available globals to a test scenario, but these scenario settings are optional.
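For reference, the data objects that a basic scenario exercises are plain Java classes. The following is a minimal sketch of two hypothetical data objects, Applicant and LoanApplication (the class and field names are assumptions for illustration only), including a LocalDate field of the kind described in the preceding note:

    // Applicant.java: hypothetical data object used as a GIVEN fact
    import java.time.LocalDate;

    public class Applicant {
        private int age;
        private LocalDate applicationDate; // entered as, for example, 17-Oct-2020 in the legacy designer

        public int getAge() { return age; }
        public void setAge(int age) { this.age = age; }
        public LocalDate getApplicationDate() { return applicationDate; }
        public void setApplicationDate(LocalDate applicationDate) { this.applicationDate = applicationDate; }
    }

    // LoanApplication.java: hypothetical data object whose fields are validated in the EXPECT section
    public class LoanApplication {
        private boolean approved;

        public boolean isApproved() { return approved; }
        public void setApproved(boolean approved) { this.approved = approved; }
    }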

Procedure

  1. In Business Central, go to Menu → Design → Projects and click the project name.
  2. Click Add Asset → Test Scenarios (Legacy).
  3. Enter an informative Test Scenario name and select the appropriate Package. The package that you specify must be the same package where the required rule assets have been assigned or will be assigned. You can import data objects from any package into the asset’s designer.
  4. Click Ok to create the test scenario.

    The new test scenario is now listed in the Test Scenarios panel of the Project Explorer.

  5. Click the Data Objects tab to verify that all data objects required for the rules that you want to test are listed. If not, click New item to import the needed data objects from other packages, or create data objects within your package.
  6. After all data objects are in place, return to the Model tab of the test scenarios designer and define the GIVEN and EXPECT data for the scenario, based on the available data objects.

    Figure 76.1. The test scenarios designer

    The GIVEN section defines the input facts for the test. For example, if an Underage rule in the project declines loan applications for applicants under the age of 21, then the GIVEN facts in the test scenario could be Applicant with age set to some integer less than 21.

    The EXPECT section defines the expected results based on the GIVEN input facts. That is, GIVEN the input facts, EXPECT these other facts to be valid or entire rules to be activated. For example, with the given facts of an applicant under the age of 21 in the scenario, the EXPECT results could be LoanApplication with approved set to false (as a result of the underage applicant), or could be the activation of the Underage rule as a whole. (A rough programmatic equivalent of this GIVEN and EXPECT flow is sketched after this procedure.)

  7. Optional: Add a CALL METHOD and any globals to the test scenario:

    • CALL METHOD: Use this to invoke a method from another fact when the rule execution is initiated. Click CALL METHOD, select a fact, and click the icon to select the method to invoke. You can invoke any Java class methods (such as methods from an ArrayList) from the Java library or from a JAR that was imported for the project (if applicable).
    • globals: Use this to add any global variables in the project that you want to validate in the test scenario. Click globals to select the variable to be validated, and then in the test scenarios designer, click the global name and define field values to be applied to the global variable. If no global variables are available, then they must be created as new assets in Business Central. Global variables are named objects that are visible to the decision engine but are different from the objects for facts. Changes in the object of a global do not trigger the re-evaluation of rules.
  8. Click More at the bottom of the test scenarios designer to add other data blocks to the same scenario file as needed.
  9. After you have defined all GIVEN, EXPECT, and other data for the scenario, click Save in the test scenarios designer to save your work.
  10. Click Run scenario in the upper-right corner to run this .scenario file, or click Run all scenarios to run all saved .scenario files in the project package (if there are multiple). Although the Run scenario option does not require the individual .scenario file to be saved, the Run all scenarios option does require all .scenario files to be saved.

    If the test fails, address any problems described in the Alerts message at the bottom of the window, review all components in the scenario, and try again to validate the scenario until the scenario passes.

  11. Click Save in the test scenarios designer to save your work after all changes are complete.
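The legacy designer assembles and executes the scenario for you, but it can help to picture the rough equivalent at the decision engine level. The following sketch is only a conceptual illustration, not what the designer generates: it assumes the hypothetical Applicant and LoanApplication data objects sketched earlier, a KIE session named ksession-tests defined in the project, a global named messages declared in the rules, and an Underage rule that declines applications for applicants under 21.

    import java.util.ArrayList;
    import java.util.List;

    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieSession;

    public class UnderageScenarioSketch {
        public static void main(String[] args) {
            // Assumed session name, defined in the project's kmodule.xml
            KieSession session = KieServices.Factory.get()
                    .getKieClasspathContainer()
                    .newKieSession("ksession-tests");

            // Comparable to the optional globals block in the designer ("messages" is an assumed global name)
            List<String> messages = new ArrayList<>();
            session.setGlobal("messages", messages);

            // GIVEN: an underage applicant and an unapproved loan application
            Applicant applicant = new Applicant();
            applicant.setAge(18);
            LoanApplication application = new LoanApplication();
            session.insert(applicant);
            session.insert(application);

            // Run the rules, as Run scenario does in the designer
            session.fireAllRules();

            // EXPECT: the assumed Underage rule declined the application
            if (application.isApproved()) {
                throw new AssertionError("Expected approved to be false");
            }
            session.dispose();
        }
    }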

76.1.1. Adding GIVEN facts in test scenarios (legacy)

The GIVEN section defines input facts for the test. For example, if an Underage rule in the project declines loan applications for applicants under the age of 21, then the GIVEN facts in the test scenario could be Applicant with age set to some integer less than 21.

Prerequisites

  • All data objects required for your test scenario have been created or imported and are listed in the Data Objects tab of the Test Scenarios (Legacy) designer.

Procedure

  1. In the Test Scenarios (Legacy) designer, click GIVEN to open the New input window with the available facts.

    Figure 76.2. Add GIVEN input to the test scenario

    The list includes the following options, depending on the data objects available in the Data Objects tab of the test scenarios designer:

    • Insert a new fact: Use this to add a fact and modify its field values. Enter a variable for the fact as the Fact name.
    • Modify an existing fact: (Appears only after another fact has been added.) Use this to specify a previously inserted fact to be modified in the decision engine between executions of the scenario.
    • Delete an existing fact: (Appears only after another fact has been added.) Use this to specify a previously inserted fact to be deleted from the decision engine between executions of the scenario.
    • Activate rule flow group: Use this to specify a rule flow group to be activated so that all rules within that group can be tested.
  2. Choose a fact for the desired input option and click Add. For example, set Insert a new fact: to Applicant and enter a or app or any other variable for the Fact name.
  3. Click the fact in the test scenarios designer and select the field to be modified.

    Figure 76.3. Modify a fact field

  4. Click the edit icon and select from the following field values:

    • Literal value: Creates an open field in which you enter a specific literal value.
    • Bound variable: Sets the value of the field to the fact bound to a selected variable. The field type must match the bound variable type.
    • Create new fact: Enables you to create a new fact and assign it as a field value of the parent fact. Then you can click the child fact in the test scenarios designer and likewise assign field values or nest other facts.
  5. Continue adding any other GIVEN input data for the scenario and click Save in the test scenarios designer to save your work.
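The Insert a new fact, Modify an existing fact, and Delete an existing fact options in the preceding procedure correspond to basic fact operations in the decision engine. The following sketch illustrates those operations with the public KIE API, reusing the assumed ksession-tests session name and the hypothetical Applicant data object from the earlier examples:

    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieSession;
    import org.kie.api.runtime.rule.FactHandle;

    public class GivenOperationsSketch {
        public static void main(String[] args) {
            KieSession session = KieServices.Factory.get()
                    .getKieClasspathContainer()
                    .newKieSession("ksession-tests"); // assumed session name

            // Insert a new fact
            Applicant applicant = new Applicant();
            applicant.setAge(18);
            FactHandle handle = session.insert(applicant);
            session.fireAllRules();

            // Modify an existing fact between executions of the scenario
            applicant.setAge(25);
            session.update(handle, applicant);
            session.fireAllRules();

            // Delete an existing fact between executions of the scenario
            session.delete(handle);
            session.fireAllRules();

            session.dispose();
        }
    }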

76.1.2. Adding EXPECT results in test scenarios (legacy)

The EXPECT section defines the expected results based on the GIVEN input facts. That is, GIVEN the input facts, EXPECT other specified facts to be valid or entire rules to be activated. For example, with the given facts of an applicant under the age of 21 in the scenario, the EXPECT results could be LoanApplication with approved set to false (as a result of the underage applicant), or could be the activation of the Underage rule as a whole.

Prerequisites

  • All data objects required for your test scenario have been created or imported and are listed in the Data Objects tab of the Test Scenarios (Legacy) designer.

Procedure

  1. In the Test Scenarios (Legacy) designer, click EXPECT to open the New expectation window with the available facts.

    Figure 76.4. Add EXPECT results to the test scenario

    The list includes the following options, depending on the data in the GIVEN section and the data objects available in the Data Objects tab of the test scenarios designer:

    • Rule: Use this to specify a particular rule in the project that is expected to be activated as a result of the GIVEN input. Type the name of a rule that is expected to be activated or select it from the list of rules, and then in the test scenarios designer, specify the number of times the rule should be activated.
    • Fact value: Use this to select a fact and define values for it that are expected to be valid as a result of the facts defined in the GIVEN section. The facts are listed by the Fact name previously defined for the GIVEN input.
    • Any fact that matches: Use this to validate that at least one fact with the specified values exists as a result of the GIVEN input.
  2. Choose a fact for the desired expectation (such as Fact value: application) and click Add or OK.
  3. Click the fact in the test scenarios designer and select the field to be added and modified.

    Figure 76.5. Modify a fact field

  4. Set the field values to what is expected to be valid as a result of the GIVEN input (such as approved | equals | false).

    Note

    In the legacy test scenarios designer, you can use the ["value1", "value2"] string format in the EXPECT field to validate a list of strings.

  5. Continue adding any other EXPECT input data for the scenario and click Save in the test scenarios designer to save your work.
  6. After you have defined and saved all GIVEN, EXPECT, and other data for the scenario, click Run scenario in the upper-right corner to run this .scenario file, or click Run all scenarios to run all saved .scenario files in the project package (if there are multiple). Although the Run scenario option does not require the individual .scenario file to be saved, the Run all scenarios option does require all .scenario files to be saved.

    If the test fails, address any problems described in the Alerts message at the bottom of the window, review all components in the scenario, and try again to validate the scenario until the scenario passes.

  7. Click Save in the test scenarios designer to save your work after all changes are complete.
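For comparison, the Rule and Fact value expectations in the preceding procedure roughly correspond to checking how many times a rule was activated and what values the output facts hold after execution. The following sketch counts activations of an assumed Underage rule with an agenda event listener from the public KIE API; the rule name, session name, and data objects are illustrative assumptions carried over from the earlier examples:

    import java.util.concurrent.atomic.AtomicInteger;

    import org.kie.api.KieServices;
    import org.kie.api.event.rule.AfterMatchFiredEvent;
    import org.kie.api.event.rule.DefaultAgendaEventListener;
    import org.kie.api.runtime.KieSession;

    public class ExpectResultsSketch {
        public static void main(String[] args) {
            KieSession session = KieServices.Factory.get()
                    .getKieClasspathContainer()
                    .newKieSession("ksession-tests"); // assumed session name

            // Rule expectation: count how many times the assumed "Underage" rule is activated
            AtomicInteger underageActivations = new AtomicInteger();
            session.addEventListener(new DefaultAgendaEventListener() {
                @Override
                public void afterMatchFired(AfterMatchFiredEvent event) {
                    if ("Underage".equals(event.getMatch().getRule().getName())) {
                        underageActivations.incrementAndGet();
                    }
                }
            });

            // GIVEN facts
            Applicant applicant = new Applicant();
            applicant.setAge(18);
            LoanApplication application = new LoanApplication();
            session.insert(applicant);
            session.insert(application);
            session.fireAllRules();

            // EXPECT: approved | equals | false, and the Underage rule activated exactly once
            if (application.isApproved() || underageActivations.get() != 1) {
                throw new AssertionError("Scenario expectations not met");
            }
            session.dispose();
        }
    }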