9.2. Creating a Test Scenario
Creating a Test Scenario requires you to provide data for conditions that resemble an instance of your fact or project model. This data is matched against a given set of rules, and if the expected results match the actual results, the Test Scenario is deemed to have passed.
Procedure: Creating a new Test Scenario
- Open the Projects view from the Project Authoring menu.
- Select the project where the test scenario is to be created.
- From the New Item dropdown menu on the toolbar, select Test Scenario from the listed options.
- Enter the Test Scenario name in the pop-up dialog box and click OK.
You will be presented with the Test Scenario edit screen.
Procedure: Importing a model for the Test Scenario
- Use the tabs at the bottom of the screen to move between the Test Scenario edit screen and the Config edit screen.
- The Config screen allows you to specify the model objects that you will be using in this Test Scenario.
- Data objects from the same package are available by default. For example, if you have a package structure org.company.project, with a data object Fact1 in package org.company and a Fact2 in package org.company.project, you do not have to import Fact1 to create a Test Scenario in package org.company; however, you will need to import Fact2 if you want to use it.
- To import the data objects that are required for your Scenario, click on the New Item button in the Config screen. These imports can be specific to your project's data model or generic ones such as String or Double objects.
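As context for the imports, a data object is simply a fact class in your project's data model. The listing below is a minimal, hypothetical sketch of the Product fact used as an example later in this section, with the itemID, productName, and price properties; the class and package names are illustrative assumptions, not something generated by the product.

    package org.company.project;

    // Hypothetical fact class of the kind you would import on the Config screen.
    // The property names mirror the Product example used later in this section.
    public class Product {

        private String itemID;
        private String productName;
        private double price;

        public Product() {
        }

        public Product(String itemID, String productName, double price) {
            this.itemID = itemID;
            this.productName = productName;
            this.price = price;
        }

        public String getItemID() { return itemID; }
        public void setItemID(String itemID) { this.itemID = itemID; }

        public String getProductName() { return productName; }
        public void setProductName(String productName) { this.productName = productName; }

        public double getPrice() { return price; }
        public void setPrice(double price) { this.price = price; }
    }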
Procedure: Providing Test Scenario Facts
- After you have imported the data objects, come back to the Test Scenario screen and enter the variables for your Scenario.
At a minimum, there are two sections that require input: GIVEN and EXPECT.
- GIVEN: What are the input facts for this Test Scenario?
- EXPECT: What are the expected results given the input facts from the GIVEN section?
GIVEN these input parameters, EXPECT these rules to be activated or fired. You can also EXPECT facts to be present with specific field values, or EXPECT rules not to fire at all. If the expectations are met, the Test Scenario has passed and your rules have been created correctly. If the expectations are not met, the Test Scenario has failed and you need to check your rules.
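The GIVEN/EXPECT structure follows the same logic as an ordinary unit test against a rule session: insert the given facts, fire the rules, and check the results. The sketch below is not the Test Scenario editor itself, only an illustration of that logic using the public KIE API, the hypothetical Product fact shown earlier, and an assumed rule named "Discount large orders"; all names and values are illustrative.

    import org.kie.api.KieBase;
    import org.kie.api.io.ResourceType;
    import org.kie.api.runtime.KieSession;
    import org.kie.internal.utils.KieHelper;

    import org.company.project.Product; // hypothetical fact class sketched above

    public class ProductScenarioSketch {

        // Hypothetical rule: discount any widget priced above 100.
        private static final String DRL =
                "package org.company.project\n" +
                "rule \"Discount large orders\"\n" +
                "when\n" +
                "    $p : Product( productName == \"widget\", price > 100.0 )\n" +
                "then\n" +
                "    $p.setPrice( 99.0 );\n" +
                "end\n";

        public static void main(String[] args) {
            // Build a KieBase from the rule text with the KieHelper convenience utility.
            KieBase kieBase = new KieHelper()
                    .addContent(DRL, ResourceType.DRL)
                    .build();
            KieSession session = kieBase.newKieSession();
            try {
                // GIVEN: the input fact for this scenario.
                Product given = new Product("p-100", "widget", 120.0);
                session.insert(given);

                // Fire the rules, the equivalent of clicking Run scenario in the editor.
                int fired = session.fireAllRules();

                // EXPECT: the rule fired once and the price was updated.
                if (fired != 1 || given.getPrice() != 99.0) {
                    throw new AssertionError("Scenario failed: fired=" + fired
                            + ", price=" + given.getPrice());
                }
                System.out.println("Scenario passed");
            } finally {
                session.dispose();
            }
        }
    }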
Procedure: Providing Given Facts
To add a new fact, click on the green + button next to the GIVEN label. This will bring up the New Input dialog box. Provide your fact data in this window based on the data models that you have imported in the Config screen.
You can select a particular data object from the model and give it a variable name (called Fact Name in the window), or choose to activate a rule flow group instead. Activating a rule flow group in advance allows the rules in that group to be tested. If you want to add both a given fact and a rule flow group activation, add the given fact first, click the green + button again, and then add the activation.
Depending upon the model that you select, you can provide values for its editable properties as part of your GIVEN fact. For example, if your model is a Product, it might have properties like itemID, productName, and price. You get to these properties by clicking on the text Insert Product.
By clicking on the pencil icon next to a property, you can provide either a literal value that forms part of your GIVEN fact data or advanced fact data. See Section 9.3, “Additional Test Scenario Features” for more information.
Procedure: Providing Expected Rules
- Once you are satisfied with the Given rule conditions, you can specify the rules that you expect to fire when the Test Scenario is run and the Given conditions are met.
To do so, click on the green + button next to the EXPECT label. A New expectation dialog box will come up.
You can provide one of three expectations given the set of data that was created in the Given section. You can:
- Either type in the name of a rule that is expected to be fired or select it from the list of rules and then click the OK button.
- Expect a particular instance of a model object (and one or more of its properties) to have a certain value by selecting that instance from the drop-down in the Fact Value field; for example, product1 shown in the figure, which is an instance of the fictitious Product model created in the Given section. You specify the property values by first adding that instance by clicking the Add button and then clicking a green arrow to bring up the fields to add. Once you have selected a field to add, you can provide a literal value for that field.
- Expect a fact model to match your Given facts by selecting it from the Any fact that matches drop-down. This expectation is checked against the instances of data objects in working memory, rather than against what was set up in the Given section, because rules may change the contents of working memory; for example, some facts may be retracted, others inserted, and some may have their properties changed.
In the figure shown above, you can mandate that one or more properties of the Product instance match the Given rule conditions.
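For readers who want to see what the first expectation type checks, the fact that a particular rule fired can also be observed programmatically by attaching an agenda event listener to the session. The sketch below assumes a session like the one in the earlier sketch and simply records the names of fired rules; it is an illustration of the check, not the editor's own mechanism, and the class and method names are assumptions.

    import java.util.ArrayList;
    import java.util.List;

    import org.kie.api.event.rule.AfterMatchFiredEvent;
    import org.kie.api.event.rule.DefaultAgendaEventListener;
    import org.kie.api.runtime.KieSession;

    public class FiredRuleRecorder extends DefaultAgendaEventListener {

        private final List<String> firedRules = new ArrayList<>();

        @Override
        public void afterMatchFired(AfterMatchFiredEvent event) {
            // Record the name of each rule as it fires.
            firedRules.add(event.getMatch().getRule().getName());
        }

        public List<String> getFiredRules() {
            return firedRules;
        }

        // Usage sketch: register the listener before firing the rules, then
        // check that the expected rule name is among those that fired.
        public static void expectRuleToFire(KieSession session, String ruleName) {
            FiredRuleRecorder recorder = new FiredRuleRecorder();
            session.addEventListener(recorder);
            session.fireAllRules();
            if (!recorder.getFiredRules().contains(ruleName)) {
                throw new AssertionError("Expected rule to fire: " + ruleName);
            }
        }
    }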
Procedure: Reviewing, Saving, and Running a Scenario
- Once you are satisfied with your Test Scenario’s facts, you can save it by clicking the Save button in the upper right corner. Ensure you regularly save and review your scenarios.
You can now run your Test Scenario by clicking the Run scenario button at the top of the Test Scenario screen. The results are displayed at the bottom of this screen in a new panel called Reporting.
- Once you have multiple Test Scenarios for a particular package, you can run all of them together by opening the All Test Scenarios tab and clicking the Run all scenarios button.
