Chapter 18. Using the validation framework

Red Hat OpenStack Platform (RHOSP) includes a validation framework that you can use to verify the requirements and functionality of the undercloud and overcloud. The framework includes two types of validations:

  • Manual Ansible-based validations, which you execute through the validation command set.
  • Automatic in-flight validations, which execute during the deployment process.

You must understand which validations you want to run, and skip validations that are not relevant to your environment. For example, the pre-deployment validation includes a test for TLS-everywhere. If you do not intend to configure your environment for TLS-everywhere, this test fails. Use the --validation option in the validation run command to refine the validation according to your environment.

18.1. Ansible-based validations

During the installation of Red Hat OpenStack Platform (RHOSP) director, director also installs a set of playbooks from the openstack-tripleo-validations package. Each playbook contains tests for certain system requirements and a set of groups that define when to run the test:

no-op
Validations that run a no-op (no operation) task to verify that the workflow functions correctly. These validations run on both the undercloud and overcloud.
prep
Validations that check the hardware configuration of the undercloud node. Run these validations before you run the openstack undercloud install command.
openshift-on-openstack
Validations that check that the environment meets the requirements to be able to deploy OpenShift on OpenStack.
pre-introspection
Validations to run before node introspection with Ironic Inspector.
pre-deployment
Validations to run before the openstack overcloud deploy command.
post-deployment
Validations to run after the overcloud deployment has finished.
pre-upgrade
Validations to validate your RHOSP deployment before an upgrade.
post-upgrade
Validations to validate your RHOSP deployment after an upgrade.

18.2. Changing the validation configuration file

The validation configuration file is a .ini file that you can edit to control every aspect of the validation execution and the communication between remote machines.

You can change the default configuration values in one of the following ways:

  • Edit the default /etc/validation.cfg file.
  • Make your own copy of the default /etc/validation.cfg file, edit the copy, and pass it to the CLI with the --config argument. If you create your own copy of the configuration file, point the CLI to this file on each execution with --config.

By default, the location of the validation configuration file is /etc/validation.cfg.
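For example, to allow user-created validations, you can set the enable_community_validations parameter, which is described later in this chapter. The excerpt below is a minimal sketch of such an edit; check the default /etc/validation.cfg on your system for the full set of sections and options:

```ini
[default]
# Allow user-created (community) validations; see "Creating a validation".
enable_community_validations = True
```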

Important

Ensure that you edit the configuration file correctly; otherwise your validations might fail with errors such as:

  • undetected validations
  • callbacks written to different locations
  • incorrectly-parsed logs

Prerequisites

  • You have a thorough understanding of how to validate your environment.

Procedure

  1. Optional: Make a copy of the validation configuration file for editing:

    1. Copy /etc/validation.cfg to your home directory.
    2. Make the required edits to the new configuration file.
  2. Run the validation command:

    $ validation run --config <configuration-file>
    • Replace <configuration-file> with the file path to the configuration file that you want to use.

      Note

      When you run a validation, the Reasons column in the output is limited to 79 characters. To view the validation result in full, view the validation log files.

18.3. Listing validations

Run the validation list command to list the different types of validations available.

Procedure

  1. Source the stackrc file:

    $ source ~/stackrc
  2. Run the validation list command:

    • To list all validations, run the command without any options:

      $ validation list
    • To list validations in a group, run the command with the --group option:

      $ validation list --group prep
Note

For a full list of options, run validation list --help.

18.4. Running validations

To run a validation or validation group, use the validation run command. To see a full list of options, use the validation run --help command.

Note

When you run a validation, the Reasons column in the output is limited to 79 characters. To view the validation result in full, view the validation log files.

Procedure

  1. Source the stackrc file:

    $ source ~/stackrc
  2. Run the validations with a static inventory file called tripleo-ansible-inventory.yaml:

    $ validation run --group pre-introspection -i tripleo-ansible-inventory.yaml
    Note

    You can find the inventory file in the ~/tripleo-deploy/<stack> directory for a standalone or undercloud deployment or in the ~/overcloud-deploy/<stack> directory for an overcloud deployment.

  3. Enter the validation run command:

    • To run a single validation, enter the command with the --validation option and the name of the validation. For example, to check the memory requirements of each node, enter --validation check-ram:

      $ validation run --validation check-ram

      To run multiple specific validations, use the --validation option with a comma-separated list of the validations that you want to run. For more information about viewing the list of available validations, see Listing validations.

    • To run all validations in a group, enter the command with the --group option:

      $ validation run --group prep

      To view detailed output from a specific validation, run the validation history get --full command against the UUID of the specific validation from the report:

      $ validation history get --full <UUID>

18.5. Creating a validation

You can create a validation with the validation init command. The command creates a basic template for a new validation. You can edit the new validation role to suit your requirements.

Important

Red Hat does not support user-created validations.

Prerequisites

  • You have a thorough understanding of how to validate your environment.
  • You have access rights to the directory where you run the command.

Procedure

  1. Create your validation:

    $ validation init <my-new-validation>
    • Replace <my-new-validation> with the name of your new validation.

      This command creates the following directory and sub-directories:

      /home/stack/community-validations
      ├── library
      ├── lookup_plugins
      ├── playbooks
      └── roles
      Note

      If you see the error message "The Community Validations are disabled by default", ensure that the enable_community_validations parameter is set to True in the validation configuration file. The default name and location of this file is /etc/validation.cfg.

  2. Edit the role to suit your requirements.

18.6. Viewing validation history

Director saves the results of each validation after you run a validation or group of validations. View past validation results with the validation history list command.

Prerequisites

  • You have run a validation or group of validations.

Procedure

  1. Log in to the undercloud host as the stack user.
  2. Source the stackrc file:

    $ source ~/stackrc
  3. View all validation history, or the history for a specific validation type:

    • View a list of all validations:

      $ validation history list
    • View history for a specific validation type by using the --validation option:

      $ validation history get --validation <validation-type>
      • Replace <validation-type> with the type of validation, for example, ntp.
  4. View the log for a specific validation UUID:

    $ validation history get --full 7380fed4-2ea1-44a1-ab71-aab561b44395


18.7. Validation framework log format

After you run a validation or group of validations, director saves a JSON-formatted log from each validation in the /var/logs/validations directory. You can view the file manually or use the validation history get --full command to display the log for a specific validation UUID.

Each validation log file follows a specific format:

  • <UUID>_<Name>_<Time>

    UUID
    The Ansible UUID for the validation.
    Name
    The Ansible name for the validation.
    Time
    The start date and time for when you ran the validation.
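As a concrete illustration, the following shell sketch builds a hypothetical log file name in this format and splits it back into its parts. The UUID is reused from the example earlier in this chapter; the validation name, timestamp, and file extension are invented for the example:

```shell
# Hypothetical log file name in the <UUID>_<Name>_<Time> format.
# The name, timestamp, and extension are illustrative, not from a real run.
log="7380fed4-2ea1-44a1-ab71-aab561b44395_check-ram_2023-11-01T14:30:12.000000Z.json"

uuid="${log%%_*}"                    # everything before the first "_"
name="${log#*_}"; name="${name%_*}"  # the middle field
timestamp="${log##*_}"               # everything after the last "_", including the extension
```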

Each validation log contains three main parts:

plays

The plays section contains information about the tasks that the director performed as part of the validation:

play
A play is a group of tasks. Each play section contains information about that particular group of tasks, including the start and end times, the duration, the host groups for the play, and the validation ID and path.
tasks
The individual Ansible tasks that director runs to perform the validation. Each tasks section contains a hosts section, which contains the action that occurred on each individual host and the results from the execution of the actions. The tasks section also contains a task section, which contains the duration of the task.

stats

The stats section contains a basic summary of the outcome of all tasks on each host, such as the tasks that succeeded and failed.

validation_output

If any tasks failed or caused a warning message during a validation, the validation_output contains the output of that failure or warning.
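Putting these parts together, a validation log has, in outline, the following shape. The contents of each object are elided; only the top-level structure described above is shown:

```
{
    "plays": [
        {
            "play": { ... },
            "tasks": [
                {
                    "hosts": { ... },
                    "task": { ... }
                }
            ]
        }
    ],
    "stats": { ... },
    "validation_output": [ ... ]
}
```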

18.8. Validation framework log output formats

The default behavior of the validation framework is to save validation logs in JSON format. You can change the format of the log output with the ANSIBLE_STDOUT_CALLBACK environment variable.

To change the validation output log format, run a validation and include the --extra-env-vars ANSIBLE_STDOUT_CALLBACK=<callback> option:

$ validation run --extra-env-vars ANSIBLE_STDOUT_CALLBACK=<callback> --validation check-ram
  • Replace <callback> with an Ansible output callback. To view a list of the standard Ansible output callbacks, run the following command:

    $ ansible-doc -t callback -l

The validation framework includes the following additional callbacks:

validation_json
The framework saves JSON-formatted validation results as a log file in /var/logs/validations. This is the default callback for the validation framework.
validation_stdout
The framework displays JSON-formatted validation results on screen.
http_json

The framework sends JSON-formatted validation results to an external logging server. You must also include additional environment variables for this callback:

HTTP_JSON_SERVER
The URL for the external server.
HTTP_JSON_PORT
The port for the API entry point of the external server. The default port is 8989.

Set these environment variables with additional --extra-env-vars options:

$ validation run --extra-env-vars ANSIBLE_STDOUT_CALLBACK=http_json \
    --extra-env-vars HTTP_JSON_SERVER=http://logserver.example.com \
    --extra-env-vars HTTP_JSON_PORT=8989 \
    --validation check-ram
Important

Before you use the http_json callback, you must add http_json to the callback_whitelist parameter in your ansible.cfg file:

callback_whitelist = http_json

18.9. In-flight validations

Red Hat OpenStack Platform (RHOSP) includes in-flight validations in the templates of composable services. In-flight validations verify the operational status of services at key steps of the overcloud deployment process.

In-flight validations run automatically as part of the deployment process. Some in-flight validations also use the roles from the openstack-tripleo-validations package.