ansible-playbook [core 2.12.6] config file = /etc/ansible/ansible.cfg configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python3.9/site-packages/ansible ansible collection location = /tmp/tmpdbh6f40u executable location = /usr/bin/ansible-playbook python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)] jinja version = 2.11.3 libyaml = True Using /etc/ansible/ansible.cfg as config file Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_create_thinp_then_remove_scsi_generated.yml ******************** 2 plays in /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove_scsi_generated.yml:3 Thursday 21 July 2022 14:49:47 +0000 (0:00:00.013) 0:00:00.013 ********* ok: [/cache/fedora-36.qcow2.snap] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove_scsi_generated.yml:7 Thursday 21 July 2022 14:49:48 +0000 (0:00:01.365) 0:00:01.378 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:2 Thursday 21 July 2022 14:49:48 +0000 (0:00:00.048) 0:00:01.427 ********* ok: [/cache/fedora-36.qcow2.snap] META: ran handlers TASK [include_role : linux-system-roles.storage] ******************************* task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:14 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.967) 0:00:02.395 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.034) 0:00:02.429 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:49:49 +0000 (0:00:00.030) 0:00:02.460 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:49:50 +0000 (0:00:00.572) 0:00:03.032 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], 
"_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:49:50 +0000 (0:00:00.053) 0:00:03.086 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:49:50 +0000 (0:00:00.030) 0:00:03.117 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:49:50 +0000 (0:00:00.030) 0:00:03.147 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:49:50 +0000 (0:00:00.047) 0:00:03.195 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:49:50 +0000 (0:00:00.018) 0:00:03.213 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:49:53 +0000 (0:00:02.452) 0:00:05.666 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:49:53 +0000 (0:00:00.063) 0:00:05.730 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: 
/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:49:53 +0000 (0:00:00.034) 0:00:05.764 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:49:53 +0000 (0:00:00.825) 0:00:06.590 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:49:54 +0000 (0:00:00.075) 0:00:06.665 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:49:54 +0000 (0:00:00.041) 0:00:06.707 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:49:54 +0000 (0:00:00.037) 0:00:06.744 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:49:54 +0000 (0:00:00.043) 0:00:06.787 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:49:56 +0000 (0:00:01.901) 0:00:08.689 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", 
"status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "inactive", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, 
"systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": 
"systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": 
"systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:49:58 +0000 (0:00:02.193) 0:00:10.883 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.055) 0:00:10.938 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.022) 0:00:10.961 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.584) 0:00:11.545 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.035) 0:00:11.580 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:49:58 +0000 (0:00:00.020) 0:00:11.601 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: 
/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.037) 0:00:11.638 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.035) 0:00:11.673 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.036) 0:00:11.710 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.035) 0:00:11.745 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.025) 0:00:11.771 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.035) 0:00:11.806 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.024) 0:00:11.831 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.562) 0:00:12.393 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:49:59 +0000 (0:00:00.023) 0:00:12.417 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK 
[include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:17 Thursday 21 July 2022 14:50:00 +0000 (0:00:00.981) 0:00:13.398 ********* included: /tmp/tmpmb3dyg70/tests/get_unused_disk.yml for /cache/fedora-36.qcow2.snap TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmpmb3dyg70/tests/get_unused_disk.yml:2 Thursday 21 July 2022 14:50:00 +0000 (0:00:00.037) 0:00:13.435 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "disks": [ "sda", "sdb", "sdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmpmb3dyg70/tests/get_unused_disk.yml:9 Thursday 21 July 2022 14:50:02 +0000 (0:00:01.565) 0:00:15.001 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmpmb3dyg70/tests/get_unused_disk.yml:14 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.038) 0:00:15.039 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/get_unused_disk.yml:19 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.036) 0:00:15.076 ********* ok: [/cache/fedora-36.qcow2.snap] => { "unused_disks": [ "sda", "sdb", "sdc" ] } TASK [Create a thinpool device] ************************************************ task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:21 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.036) 0:00:15.112 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.043) 0:00:15.156 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:02 +0000 (0:00:00.032) 0:00:15.189 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.559) 0:00:15.748 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": 
"Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.058) 0:00:15.807 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.030) 0:00:15.837 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.031) 0:00:15.868 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.045) 0:00:15.914 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:03 +0000 (0:00:00.021) 0:00:15.936 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:05 +0000 (0:00:01.867) 0:00:17.803 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "3g", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:05 +0000 (0:00:00.036) 0:00:17.840 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:05 +0000 (0:00:00.034) 0:00:17.874 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* 
task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:07 +0000 (0:00:02.040) 0:00:19.915 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:07 +0000 (0:00:00.044) 0:00:19.960 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:07 +0000 (0:00:00.080) 0:00:20.040 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:50:07 +0000 (0:00:00.040) 0:00:20.081 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:07 +0000 (0:00:00.046) 0:00:20.127 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:09 +0000 (0:00:01.848) 0:00:21.975 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": 
"pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { 
"name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": 
"systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": 
"systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:11 +0000 (0:00:02.047) 0:00:24.023 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:11 +0000 (0:00:00.055) 0:00:24.079 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:11 +0000 (0:00:00.020) 0:00:24.100 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs", "btrfs-progs", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:50:16 +0000 (0:00:04.943) 0:00:29.043 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:16 +0000 (0:00:00.039) 0:00:29.082 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:50:16 +0000 (0:00:00.021) 0:00:29.104 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": 
"/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "lvm2", "e2fsprogs", "btrfs-progs", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:50:16 +0000 (0:00:00.039) 0:00:29.143 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:50:16 +0000 (0:00:00.038) 0:00:29.182 
********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:50:16 +0000 (0:00:00.036) 0:00:29.219 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:50:16 +0000 (0:00:00.038) 0:00:29.257 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:50:17 +0000 (0:00:01.078) 0:00:30.335 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:50:18 +0000 (0:00:00.752) 0:00:31.088 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:50:19 +0000 (0:00:00.828) 0:00:31.916 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:50:19 +0000 (0:00:00.447) 0:00:32.363 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:50:19 +0000 (0:00:00.032) 0:00:32.395 ********* ok: 
[/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:38 Thursday 21 July 2022 14:50:20 +0000 (0:00:00.994) 0:00:33.390 ********* included: /tmp/tmpmb3dyg70/tests/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:50:20 +0000 (0:00:00.042) 0:00:33.433 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:50:20 +0000 (0:00:00.050) 0:00:33.484 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:50:20 +0000 (0:00:00.035) 0:00:33.520 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "82I52n-I6WQ-XH4c-KoBP-aOXr-t5d0-RlMBbK" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "TLGIA9-8V7A-4pyu-lK2E-tPi9-sEse-k3nds6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "APpUgy-DXrO-zI5B-bePe-qUlx-RxxN-IhE01f" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-34-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": 
"738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:50:21 +0000 (0:00:00.562) 0:00:34.082 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003379", "end": "2022-07-21 14:50:21.668479", "rc": 0, "start": "2022-07-21 14:50:21.665100" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.547) 0:00:34.629 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.004274", "end": "2022-07-21 14:50:22.114106", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:50:22.109832" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.445) 0:00:35.075 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 
'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.059) 0:00:35.135 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.067) 0:00:35.202 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.091) 0:00:35.294 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:50:22 +0000 (0:00:00.055) 0:00:35.350 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:50:24 +0000 (0:00:01.290) 0:00:36.640 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.051) 0:00:36.692 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.050) 0:00:36.742 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.048) 0:00:36.791 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.040) 0:00:36.831 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.048) 0:00:36.880 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.023) 0:00:36.903 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.066) 0:00:36.969 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.043) 0:00:37.013 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.024) 0:00:37.038 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.023) 0:00:37.061 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.025) 0:00:37.086 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.026) 0:00:37.112 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.025) 0:00:37.138 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.026) 0:00:37.164 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.024) 0:00:37.188 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.037) 0:00:37.226 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.047) 0:00:37.274 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.048) 0:00:37.322 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.067) 0:00:37.389 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.030) 0:00:37.420 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 
14:50:24 +0000 (0:00:00.028) 0:00:37.448 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.046) 0:00:37.494 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:50:24 +0000 (0:00:00.050) 0:00:37.545 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv1&&segtype=thin", "vg1" ], "delta": "0:00:00.044296", "end": "2022-07-21 14:50:25.060259", "rc": 0, "start": "2022-07-21 14:50:25.015963" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.479) 0:00:38.024 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.060) 0:00:38.085 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.054) 0:00:38.140 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.043) 0:00:38.184 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.051) 0:00:38.235 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", 
"_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.053) 0:00:38.289 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.036) 0:00:38.325 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.053) 0:00:38.378 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.051) 0:00:38.430 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.050) 0:00:38.480 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.039) 0:00:38.520 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.038) 0:00:38.558 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:50:25 +0000 (0:00:00.034) 0:00:38.593 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": 
false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.033) 0:00:38.626 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.046) 0:00:38.672 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.048) 0:00:38.721 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.034) 0:00:38.755 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.034) 0:00:38.790 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.036) 0:00:38.826 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.078) 0:00:38.905 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.138) 0:00:39.043 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.051) 0:00:39.095 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.038) 0:00:39.133 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.038) 0:00:39.172 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.037) 0:00:39.209 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.037) 0:00:39.247 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.038) 0:00:39.285 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.058) 0:00:39.344 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.047) 0:00:39.391 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.023) 0:00:39.415 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.026) 0:00:39.441 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: 
/tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.024) 0:00:39.466 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.022) 0:00:39.488 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.022) 0:00:39.511 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.022) 0:00:39.534 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.026) 0:00:39.560 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:50:26 +0000 (0:00:00.033) 0:00:39.594 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.036) 0:00:39.630 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.047) 0:00:39.678 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.051) 0:00:39.730 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.082) 0:00:39.812 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.059) 0:00:39.872 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.059) 0:00:39.931 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.050) 0:00:39.982 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.049) 0:00:40.031 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.048) 0:00:40.080 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.023) 0:00:40.104 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.024) 0:00:40.128 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.023) 0:00:40.152 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.075) 0:00:40.227 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.144) 0:00:40.371 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.052) 0:00:40.424 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.049) 0:00:40.474 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] 
****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.041) 0:00:40.515 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.035) 0:00:40.551 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:50:27 +0000 (0:00:00.052) 0:00:40.604 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.042) 0:00:40.646 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658415018.1222885, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415016.0012884, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1169, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415016.0012884, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.414) 0:00:41.061 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.041) 0:00:41.102 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.040) 0:00:41.143 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.036) 0:00:41.179 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.024) 0:00:41.203 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.039) 0:00:41.243 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:50:28 +0000 (0:00:00.023) 0:00:41.266 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:50:30 +0000 (0:00:01.857) 0:00:43.123 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.025) 0:00:43.149 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.026) 0:00:43.175 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.052) 0:00:43.228 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.025) 0:00:43.254 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.024) 0:00:43.279 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.023) 0:00:43.303 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.025) 0:00:43.328 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.024) 0:00:43.353 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.051) 0:00:43.404 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.090) 0:00:43.495 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:50:30 +0000 (0:00:00.041) 0:00:43.537 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.099) 0:00:43.636 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.045) 0:00:43.682 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.037) 0:00:43.719 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.041) 0:00:43.761 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.040) 0:00:43.802 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.041) 0:00:43.843 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.042) 0:00:43.885 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.038) 0:00:43.924 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.038) 0:00:43.962 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.040) 0:00:44.002 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:50:31 +0000 (0:00:00.522) 0:00:44.525 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.434) 0:00:44.960 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.052) 0:00:45.012 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.034) 0:00:45.047 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.037) 0:00:45.084 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.037) 0:00:45.122 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.036) 0:00:45.158 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.038) 0:00:45.233 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.035) 0:00:45.268 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.038) 0:00:45.307 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:50:32 +0000 (0:00:00.060) 0:00:45.367 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.045672", "end": "2022-07-21 14:50:32.879215", "rc": 0, "start": "2022-07-21 14:50:32.833543" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=Vwi-aotz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.475) 0:00:45.843 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.055) 0:00:45.898 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.041) 0:00:45.953 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.040) 0:00:45.994 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.040) 0:00:46.034 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache
size] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.041) 0:00:46.075 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.040) 0:00:46.116 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.035) 0:00:46.152 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.022) 0:00:46.174 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat previous invocation to verify idempotence] ************************ task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:40 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.034) 0:00:46.208 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.054) 0:00:46.263 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:50:33 +0000 (0:00:00.035) 0:00:46.298 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.562) 0:00:46.861 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] 
*** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.064) 0:00:46.926 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.035) 0:00:46.962 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.036) 0:00:46.999 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.049) 0:00:47.048 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:50:34 +0000 (0:00:00.022) 0:00:47.070 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:50:36 +0000 (0:00:01.841) 0:00:48.912 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "3g", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.037) 0:00:48.949 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:50:36 +0000 (0:00:00.067) 0:00:49.017 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:50:39 +0000 (0:00:03.030) 0:00:52.048 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if 
the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.046) 0:00:52.094 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.048) 0:00:52.143 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.046) 0:00:52.189 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:50:39 +0000 (0:00:00.049) 0:00:52.239 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:50:41 +0000 (0:00:01.844) 0:00:54.084 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": 
"dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", 
"status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": 
{ "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:50:43 +0000 (0:00:02.112) 0:00:56.196 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:50:43 +0000 (0:00:00.064) 0:00:56.261 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:50:43 +0000 (0:00:00.022) 0:00:56.284 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "dosfstools", "btrfs-progs", "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:50:46 +0000 (0:00:03.182) 0:00:59.466 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.039) 0:00:59.506 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.023) 0:00:59.529 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "dosfstools", "btrfs-progs", "lvm2", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:50:46 +0000 (0:00:00.052) 0:00:59.582 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", 
"sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.075) 0:00:59.657 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.115) 0:00:59.773 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.038) 0:00:59.812 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:50:47 +0000 (0:00:00.779) 0:01:00.591 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:50:48 +0000 (0:00:00.442) 0:01:01.033 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": 
null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.778) 0:01:01.812 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.415) 0:01:02.228 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:50:49 +0000 (0:00:00.024) 0:01:02.252 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:56 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.994) 0:01:03.247 ********* included: /tmp/tmpmb3dyg70/tests/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.043) 0:01:03.291 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.051) 0:01:03.342 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:50:50 +0000 (0:00:00.038) 0:01:03.380 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "82I52n-I6WQ-XH4c-KoBP-aOXr-t5d0-RlMBbK" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "TLGIA9-8V7A-4pyu-lK2E-tPi9-sEse-k3nds6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "APpUgy-DXrO-zI5B-bePe-qUlx-RxxN-IhE01f" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-34-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", 
"name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.496) 0:01:03.877 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003389", "end": "2022-07-21 14:50:51.331153", "rc": 0, "start": "2022-07-21 14:50:51.327764" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:50:51 +0000 (0:00:00.415) 0:01:04.293 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003282", "end": "2022-07-21 14:50:51.757517", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:50:51.754235" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.428) 0:01:04.721 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.058) 0:01:04.780 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.041) 0:01:04.821 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.049) 0:01:04.871 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:50:52 +0000 (0:00:00.057) 0:01:04.929 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:50:53 +0000 (0:00:01.184) 0:01:06.113 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.052) 0:01:06.166 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.057) 0:01:06.223 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.050) 0:01:06.273 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.038) 0:01:06.312 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.055) 0:01:06.367 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.026) 0:01:06.394 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.114) 0:01:06.508 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.044) 0:01:06.553 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.025) 0:01:06.578 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:50:53 +0000 (0:00:00.027) 0:01:06.605 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.062) 0:01:06.668 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.027) 
0:01:06.695 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.026) 0:01:06.722 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.026) 0:01:06.749 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.025) 0:01:06.774 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.036) 0:01:06.810 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.046) 0:01:06.857 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.044) 0:01:06.902 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.030) 0:01:06.932 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.028) 0:01:06.961 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.028) 0:01:06.990 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.045) 0:01:07.035 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.042) 0:01:07.078 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv1&&segtype=thin", "vg1" ], "delta": "0:00:00.045223", "end": "2022-07-21 14:50:54.590448", "rc": 0, "start": "2022-07-21 14:50:54.545225" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.475) 0:01:07.554 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:50:54 +0000 (0:00:00.060) 0:01:07.614 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.055) 0:01:07.670 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:65 
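The thinpool membership check a few tasks above reduces to a single lvs query followed by an assertion. A minimal standalone sketch of the same idea, reusing the names reported in this run (vg1, lv1, tpool1) rather than anything taken from the test playbook itself:

- hosts: all
  become: true
  tasks:
    - name: Get the thin pool backing lv1 (names assumed from this log)
      command: lvs --noheading -o pool_lv --select 'lv_name=lv1&&segtype=thin' vg1
      register: thinpool_check
      changed_when: false

    - name: Assert lv1 lives in thin pool tpool1
      assert:
        that:
          - "'tpool1' in thinpool_check.stdout"

As the two assertions above show, the test's own verification covers both the case where thin_pool_name is given (expecting tpool1 specifically) and the case where it is not (expecting only that some pool name comes back).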
Thursday 21 July 2022 14:50:55 +0000 (0:00:00.041) 0:01:07.712 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.049) 0:01:07.762 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.051) 0:01:07.813 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.033) 0:01:07.847 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.054) 0:01:07.902 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.048) 0:01:07.950 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.051) 0:01:08.001 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.038) 0:01:08.040 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 
2022 14:50:55 +0000 (0:00:00.034) 0:01:08.074 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.034) 0:01:08.109 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.111) 0:01:08.221 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.065) 0:01:08.286 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.051) 0:01:08.338 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.039) 0:01:08.377 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.037) 0:01:08.415 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.036) 0:01:08.452 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.039) 0:01:08.491 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.050) 0:01:08.542 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:50:55 +0000 (0:00:00.048) 0:01:08.590 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] 
********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.040) 0:01:08.631 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.037) 0:01:08.669 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.037) 0:01:08.706 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.037) 0:01:08.744 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.035) 0:01:08.779 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.051) 0:01:08.831 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.047) 0:01:08.878 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.024) 0:01:08.902 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.022) 0:01:08.925 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.022) 0:01:08.948 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.022) 0:01:08.970 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.025) 0:01:08.996 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.024) 0:01:09.020 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.024) 0:01:09.045 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.034) 0:01:09.080 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.036) 0:01:09.116 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.045) 0:01:09.161 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.052) 0:01:09.214 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.088) 0:01:09.302 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.094) 0:01:09.396 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK 
[Verify the current mount state by device] ******************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.137) 0:01:09.534 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:50:56 +0000 (0:00:00.051) 0:01:09.585 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.052) 0:01:09.638 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.051) 0:01:09.689 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.027) 0:01:09.716 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.026) 0:01:09.743 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.028) 0:01:09.771 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.037) 0:01:09.809 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.064) 0:01:09.873 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:32 
Thursday 21 July 2022 14:50:57 +0000 (0:00:00.051) 0:01:09.924 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.055) 0:01:09.980 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.038) 0:01:10.018 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.046) 0:01:10.064 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.042) 0:01:10.106 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.041) 0:01:10.148 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658415018.1222885, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415016.0012884, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1169, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415016.0012884, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.421) 0:01:10.569 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:50:57 +0000 (0:00:00.041) 0:01:10.610 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.041) 0:01:10.652 
********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.038) 0:01:10.690 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.025) 0:01:10.716 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.040) 0:01:10.756 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:50:58 +0000 (0:00:00.025) 0:01:10.781 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:51:00 +0000 (0:00:01.912) 0:01:12.694 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.025) 0:01:12.719 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.025) 0:01:12.744 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.086) 0:01:12.830 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.027) 0:01:12.857 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.026) 0:01:12.884 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:55 
Thursday 21 July 2022 14:51:00 +0000 (0:00:00.026) 0:01:12.911 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.025) 0:01:12.937 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.027) 0:01:12.964 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.053) 0:01:13.017 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.051) 0:01:13.069 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.036) 0:01:13.105 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.034) 0:01:13.140 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.038) 0:01:13.178 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.033) 0:01:13.212 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.036) 0:01:13.248 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:51:00 
+0000 (0:00:00.039) 0:01:13.288 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.035) 0:01:13.323 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.037) 0:01:13.361 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.042) 0:01:13.404 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.042) 0:01:13.447 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:51:00 +0000 (0:00:00.037) 0:01:13.484 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.422) 0:01:13.906 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.425) 0:01:14.332 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.052) 0:01:14.384 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.039) 0:01:14.423 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.038) 0:01:14.462 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.037) 0:01:14.499 ********* 
skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.036) 0:01:14.536 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:51:01 +0000 (0:00:00.041) 0:01:14.578 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.078) 0:01:14.656 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.039) 0:01:14.696 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.038) 0:01:14.735 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.054) 0:01:14.789 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.043059", "end": "2022-07-21 14:51:02.283081", "rc": 0, "start": "2022-07-21 14:51:02.240022" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=Vwi-aotz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.456) 0:01:15.246 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.050) 0:01:15.296 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.051) 0:01:15.348 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.041) 0:01:15.390 
********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.039) 0:01:15.430 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.037) 0:01:15.467 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.039) 0:01:15.507 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.038) 0:01:15.546 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.023) 0:01:15.569 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Change thinlv fs type] *************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:58 Thursday 21 July 2022 14:51:02 +0000 (0:00:00.034) 0:01:15.603 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.054) 0:01:15.657 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.035) 0:01:15.693 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.557) 0:01:16.250 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.065) 0:01:16.316 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.034) 0:01:16.351 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.032) 0:01:16.383 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.047) 0:01:16.431 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:51:03 +0000 (0:00:00.061) 0:01:16.492 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:51:05 +0000 (0:00:01.778) 0:01:18.270 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "fs_type": "xfs", "name": "lv1", "thin": true, "thin_pool_name": "tpool1" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:51:05 +0000 (0:00:00.052) 0:01:18.323 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:51:05 +0000 (0:00:00.038) 0:01:18.361 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2", "xfsprogs" ], 
"pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:51:08 +0000 (0:00:03.017) 0:01:21.379 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:51:08 +0000 (0:00:00.050) 0:01:21.430 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:08 +0000 (0:00:00.050) 0:01:21.480 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:08 +0000 (0:00:00.058) 0:01:21.539 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:08 +0000 (0:00:00.049) 0:01:21.589 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:10 +0000 (0:00:01.748) 0:01:23.337 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { 
"name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" 
}, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", 
"status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": 
"systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:12 +0000 (0:00:02.064) 0:01:25.401 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:12 +0000 (0:00:00.060) 0:01:25.462 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:12 +0000 (0:00:00.023) 0:01:25.485 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs", "lvm2", "btrfs-progs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:51:16 +0000 (0:00:03.189) 0:01:28.674 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.082) 0:01:28.757 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.023) 0:01:28.780 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "dosfstools", "e2fsprogs", "xfsprogs", "lvm2", "btrfs-progs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK 
[linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.038) 0:01:28.818 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.039) 0:01:28.858 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.036) 0:01:28.895 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'path': '/opt/test1', 'state': 'absent'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:51:16 +0000 (0:00:00.461) 0:01:29.357 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:51:17 +0000 (0:00:00.789) 0:01:30.146 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 
2022 14:51:17 +0000 (0:00:00.042) 0:01:30.188 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.776) 0:01:30.965 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.430) 0:01:31.396 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:51:18 +0000 (0:00:00.025) 0:01:31.421 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:72 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.986) 0:01:32.408 ********* included: /tmp/tmpmb3dyg70/tests/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.078) 0:01:32.486 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.051) 0:01:32.538 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:51:19 +0000 (0:00:00.036) 0:01:32.574 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "82I52n-I6WQ-XH4c-KoBP-aOXr-t5d0-RlMBbK" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "TLGIA9-8V7A-4pyu-lK2E-tPi9-sEse-k3nds6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "APpUgy-DXrO-zI5B-bePe-qUlx-RxxN-IhE01f" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-34-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", 
"size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:51:20 +0000 (0:00:00.430) 0:01:33.005 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004563", "end": "2022-07-21 14:51:21.470175", "rc": 0, "start": "2022-07-21 14:51:20.465612" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:51:21 +0000 (0:00:01.449) 0:01:34.455 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003250", "end": "2022-07-21 14:51:21.919216", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:51:21.915966" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:51:22 +0000 (0:00:00.427) 0:01:34.882 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:51:22 +0000 (0:00:00.064) 0:01:34.947 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:51:22 +0000 (0:00:00.036) 0:01:34.983 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 
14:51:22 +0000 (0:00:00.048) 0:01:35.032 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:51:22 +0000 (0:00:00.055) 0:01:35.087 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:51:23 +0000 (0:00:01.253) 0:01:36.341 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.050) 0:01:36.392 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.099) 0:01:36.491 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.054) 0:01:36.545 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:51:23 +0000 (0:00:00.039) 0:01:36.585 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.089) 0:01:36.674 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.027) 0:01:36.702 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" 
} MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.111) 0:01:36.813 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.078) 0:01:36.892 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.024) 0:01:36.916 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.026) 0:01:36.943 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.026) 0:01:36.969 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.025) 0:01:36.994 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.024) 0:01:37.018 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.023) 0:01:37.042 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.024) 0:01:37.067 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.033) 0:01:37.100 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.044) 0:01:37.145 ********* included: 
/tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.048) 0:01:37.193 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.027) 0:01:37.220 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.028) 0:01:37.249 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.030) 0:01:37.280 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.046) 0:01:37.326 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:51:24 +0000 (0:00:00.046) 0:01:37.372 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv1&&segtype=thin", "vg1" ], "delta": "0:00:00.049778", "end": "2022-07-21 14:51:24.886881", "rc": 0, "start": "2022-07-21 14:51:24.837103" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.484) 0:01:37.857 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.059) 0:01:37.916 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.055) 0:01:37.972 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.044) 0:01:38.017 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.047) 0:01:38.064 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.051) 0:01:38.116 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.033) 0:01:38.150 ********* included: 
/tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.053) 0:01:38.203 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.051) 0:01:38.254 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.091) 0:01:38.345 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.039) 0:01:38.385 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.077) 0:01:38.462 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.038) 0:01:38.501 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.039) 0:01:38.540 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:51:25 +0000 (0:00:00.052) 0:01:38.592 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.055) 0:01:38.647 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.040) 0:01:38.688 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.040) 0:01:38.729 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.040) 0:01:38.770 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.038) 0:01:38.808 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.052) 0:01:38.860 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.049) 0:01:38.910 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.036) 0:01:38.947 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.040) 0:01:38.987 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.035) 0:01:39.023 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.037) 0:01:39.060 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.036) 0:01:39.097 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task 
path: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.051) 0:01:39.148 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.049) 0:01:39.197 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.025) 0:01:39.223 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.025) 0:01:39.248 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.023) 0:01:39.272 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.023) 0:01:39.296 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.023) 0:01:39.320 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.024) 0:01:39.345 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.026) 0:01:39.371 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.034) 0:01:39.406 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.035) 0:01:39.441 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.046) 0:01:39.488 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.048) 0:01:39.536 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: 
/tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:51:26 +0000 (0:00:00.076) 0:01:39.612 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.113) 0:01:39.726 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.053) 0:01:39.779 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.023) 0:01:39.803 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.058) 0:01:39.861 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.037) 0:01:39.899 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.029) 0:01:39.929 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.027) 0:01:39.957 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.028) 0:01:39.985 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.034) 0:01:40.019 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.059) 0:01:40.079 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.048) 0:01:40.128 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.048) 0:01:40.176 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.035) 0:01:40.211 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.035) 0:01:40.247 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.037) 0:01:40.284 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:51:27 +0000 (0:00:00.044) 0:01:40.328 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658415018.1222885, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415016.0012884, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1169, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415016.0012884, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, 
"version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.427) 0:01:40.756 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.040) 0:01:40.796 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.040) 0:01:40.837 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.037) 0:01:40.874 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.025) 0:01:40.900 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.041) 0:01:40.941 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:51:28 +0000 (0:00:00.024) 0:01:40.966 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:51:30 +0000 (0:00:01.960) 0:01:42.927 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.025) 0:01:42.952 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.023) 0:01:42.976 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.086) 
0:01:43.063 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.026) 0:01:43.090 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.025) 0:01:43.115 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.025) 0:01:43.141 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.025) 0:01:43.166 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.053) 0:01:43.220 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.052) 0:01:43.273 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.052) 0:01:43.325 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.042) 0:01:43.368 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.038) 0:01:43.407 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.037) 0:01:43.444 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": 
null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.035) 0:01:43.479 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.034) 0:01:43.514 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.034) 0:01:43.548 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:51:30 +0000 (0:00:00.040) 0:01:43.589 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:51:31 +0000 (0:00:00.036) 0:01:43.625 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:51:31 +0000 (0:00:00.035) 0:01:43.661 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:51:31 +0000 (0:00:00.038) 0:01:43.700 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:51:31 +0000 (0:00:00.035) 0:01:43.736 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:51:31 +0000 (0:00:00.411) 0:01:44.148 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:51:31 +0000 (0:00:00.427) 0:01:44.575 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: 
/tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.062) 0:01:44.638 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.035) 0:01:44.673 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.036) 0:01:44.709 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.037) 0:01:44.746 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.038) 0:01:44.785 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.036) 0:01:44.822 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.036) 0:01:44.859 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.034) 0:01:44.894 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.069) 0:01:44.964 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.145) 0:01:45.109 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.044688", "end": "2022-07-21 14:51:32.618174", "rc": 0, "start": "2022-07-21 14:51:32.573486" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=Vwi-a-tz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:51:32 +0000 (0:00:00.472) 0:01:45.582 ********* ok: 
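For reference, the segment-type check recorded above can be reproduced as a small standalone playbook. This is an illustrative sketch based only on the lvs command line and STDOUT visible in the log, not the verbatim contents of test-verify-volume-cache.yml; the play header (hosts, become) and the register variable name lvs_info are invented for the example.

- hosts: all
  become: true
  tasks:
    # Run the same lvs query the test issued against vg1/lv1 (sketch).
    - name: Get information about the LV
      ansible.builtin.command:
        argv:
          - lvs
          - --noheadings
          - --nameprefixes
          - --units=b
          - --nosuffix
          - --unquoted
          - -o
          - name,attr,cache_total_blocks,chunk_size,segtype
          - vg1/lv1
      register: lvs_info
      changed_when: false

    # The log's STDOUT contains LVM2_SEGTYPE=thin; assert on that marker.
    - name: check segment type
      ansible.builtin.assert:
        that:
          - "'LVM2_SEGTYPE=thin' in lvs_info.stdout"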
[/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.050) 0:01:45.632 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.050) 0:01:45.683 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.037) 0:01:45.720 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.042) 0:01:45.763 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.037) 0:01:45.800 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.039) 0:01:45.840 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.036) 0:01:45.877 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.022) 0:01:45.900 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create new LV under existing thinpool] *********************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:74 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.033) 0:01:45.933 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.060) 0:01:45.994 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.037) 0:01:46.032 ********* ok: [/cache/fedora-36.qcow2.snap] TASK 
[linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:51:33 +0000 (0:00:00.569) 0:01:46.601 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:51:34 +0000 (0:00:00.062) 0:01:46.664 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:51:34 +0000 (0:00:00.035) 0:01:46.699 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:51:34 +0000 (0:00:00.033) 0:01:46.733 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:51:34 +0000 (0:00:00.047) 0:01:46.780 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:51:34 +0000 (0:00:00.020) 0:01:46.800 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:51:36 +0000 (0:00:01.885) 0:01:48.685 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test2", "name": "lv2", "size": "4g", "thin": true, "thin_pool_name": "tpool1" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:51:36 +0000 (0:00:00.037) 0:01:48.723 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:51:36 +0000 (0:00:00.038) 0:01:48.761 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:51:39 +0000 (0:00:02.912) 0:01:51.674 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:51:39 +0000 (0:00:00.047) 0:01:51.721 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:51:39 +0000 (0:00:00.046) 0:01:51.767 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:51:39 +0000 (0:00:00.039) 0:01:51.807 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:51:39 +0000 (0:00:00.044) 0:01:51.851 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:51:40 +0000 (0:00:01.737) 0:01:53.589 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { 
"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", 
"source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": 
"grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, 
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { 
"name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:51:43 +0000 (0:00:02.077) 0:01:55.666 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.097) 0:01:55.764 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:51:43 +0000 (0:00:00.061) 0:01:55.825 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" } ], "changed": 
true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" } ], "packages": [ "btrfs-progs", "e2fsprogs", "lvm2", "dosfstools", "xfsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:51:46 +0000 (0:00:03.530) 0:01:59.355 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.039) 0:01:59.395 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.023) 0:01:59.418 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" } ], "packages": [ "btrfs-progs", "e2fsprogs", "lvm2", "dosfstools", "xfsprogs" ], 
"pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.043) 0:01:59.462 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.041) 0:01:59.503 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.040) 0:01:59.543 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:51:46 +0000 (0:00:00.040) 0:01:59.584 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:51:47 +0000 (0:00:00.808) 0:02:00.392 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv2', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:51:48 +0000 (0:00:00.498) 0:02:00.890 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:51:49 +0000 (0:00:00.800) 0:02:01.691 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:51:49 +0000 (0:00:00.421) 0:02:02.112 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:51:49 +0000 (0:00:00.023) 0:02:02.136 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] 
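The two tasks above (set up new/current mounts, then tell systemd to refresh its view of /etc/fstab) are what persist /dev/mapper/vg1-lv2 at /opt/test2. Outside the role, an equivalent pair of standalone tasks would look roughly like the sketch below; this is illustrative only, and it reuses the module parameters the role reports in its loop item (src, path, fstype, opts, state).

- name: Mount the new thin LV and record it in /etc/fstab
  ansible.posix.mount:
    src: /dev/mapper/vg1-lv2
    path: /opt/test2
    fstype: xfs
    opts: defaults
    state: mounted

- name: Refresh systemd's view of /etc/fstab after the change
  ansible.builtin.systemd:
    daemon_reload: true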
*********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:89 Thursday 21 July 2022 14:51:50 +0000 (0:00:01.028) 0:02:03.164 ********* included: /tmp/tmpmb3dyg70/tests/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.050) 0:02:03.215 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.055) 0:02:03.270 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:51:50 +0000 (0:00:00.038) 0:02:03.308 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "4G", "type": "lvm", "uuid": "ea95a1d9-aa67-49a6-8d63-68fa554e728e" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "82I52n-I6WQ-XH4c-KoBP-aOXr-t5d0-RlMBbK" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "TLGIA9-8V7A-4pyu-lK2E-tPi9-sEse-k3nds6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "APpUgy-DXrO-zI5B-bePe-qUlx-RxxN-IhE01f" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-34-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": 
"disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:51:51 +0000 (0:00:00.427) 0:02:03.736 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003770", "end": "2022-07-21 14:51:51.190881", "rc": 0, "start": "2022-07-21 14:51:51.187111" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:51:51 +0000 (0:00:00.418) 0:02:04.154 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003760", "end": "2022-07-21 14:51:51.611337", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:51:51.607577" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:51:51 +0000 (0:00:00.423) 0:02:04.577 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 
'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:51:52 +0000 (0:00:00.062) 0:02:04.640 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:51:52 +0000 (0:00:00.037) 0:02:04.677 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:51:52 +0000 (0:00:00.050) 0:02:04.728 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:51:52 +0000 (0:00:00.057) 0:02:04.785 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:51:53 +0000 (0:00:01.232) 0:02:06.018 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.055) 0:02:06.074 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.089) 0:02:06.163 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.054) 0:02:06.218 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.073) 0:02:06.291 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.084) 0:02:06.376 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.026) 0:02:06.403 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.143) 0:02:06.547 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:51:53 +0000 (0:00:00.045) 0:02:06.592 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.026) 0:02:06.619 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.026) 0:02:06.645 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.024) 0:02:06.670 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.025) 0:02:06.695 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:51:54 +0000 
(0:00:00.026) 0:02:06.722 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.024) 0:02:06.747 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.026) 0:02:06.773 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.035) 0:02:06.808 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.047) 0:02:06.856 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.049) 0:02:06.905 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.031) 0:02:06.936 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.030) 0:02:06.967 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] 
******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.031) 0:02:06.998 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.048) 0:02:07.047 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.045) 0:02:07.092 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv2&&segtype=thin", "vg1" ], "delta": "0:00:00.050078", "end": "2022-07-21 14:51:54.607749", "rc": 0, "start": "2022-07-21 14:51:54.557671" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:51:54 +0000 (0:00:00.485) 0:02:07.577 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.060) 0:02:07.638 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.052) 0:02:07.691 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.040) 0:02:07.731 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:51:55 +0000 
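The thin-pool membership check above asks lvs which pool backs lv2 and then asserts the answer matches the requested thin_pool_name. A self-contained sketch of the same check follows; the task names, the register variable thinp_query, and the simplified assertion are assumptions, while the lvs arguments and the expected value tpool1 are taken from the log.

- name: Query which thin pool backs lv2 (same lvs invocation as above)
  ansible.builtin.command:
    argv:
      - lvs
      - --noheading
      - -o
      - pool_lv
      - --select
      - lv_name=lv2&&segtype=thin
      - vg1
  register: thinp_query
  changed_when: false

- name: Assert that lv2 lives in the expected thin pool
  ansible.builtin.assert:
    that:
      # lvs pads its output with whitespace, so strip it before comparing
      - thinp_query.stdout | trim == 'tpool1'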
(0:00:00.050) 0:02:07.782 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.141) 0:02:07.924 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.032) 0:02:07.957 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.052) 0:02:08.009 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.053) 0:02:08.063 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.055) 0:02:08.118 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.039) 0:02:08.158 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.039) 0:02:08.198 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:51:55 +0000 
(0:00:00.043) 0:02:08.241 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.037) 0:02:08.279 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.054) 0:02:08.333 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.051) 0:02:08.385 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.041) 0:02:08.426 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.040) 0:02:08.466 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.039) 0:02:08.505 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.039) 0:02:08.545 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:51:55 +0000 (0:00:00.053) 0:02:08.598 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.051) 0:02:08.650 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.039) 0:02:08.690 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.038) 0:02:08.728 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.040) 0:02:08.769 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.037) 0:02:08.807 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.040) 0:02:08.848 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.050) 0:02:08.898 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.047) 0:02:08.946 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.027) 0:02:08.974 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.028) 0:02:09.002 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.025) 0:02:09.028 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.026) 0:02:09.054 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.027) 0:02:09.081 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.062) 0:02:09.144 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.025) 0:02:09.169 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.035) 0:02:09.205 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.036) 0:02:09.242 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [set_fact] **************************************************************** task path: 
/tmp/tmpmb3dyg70/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.043) 0:02:09.286 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.048) 0:02:09.334 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.079) 0:02:09.414 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.040) 0:02:09.455 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030395, "block_size": 4096, "block_total": 1046016, "block_used": 15621, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 4220497920, "size_total": 4284481536, "uuid": "ea95a1d9-aa67-49a6-8d63-68fa554e728e" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030395, "block_size": 4096, "block_total": 1046016, "block_used": 15621, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 4220497920, "size_total": 4284481536, "uuid": "ea95a1d9-aa67-49a6-8d63-68fa554e728e" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.056) 0:02:09.511 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: 
/tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.050) 0:02:09.562 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:51:56 +0000 (0:00:00.048) 0:02:09.610 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.047) 0:02:09.658 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.023) 0:02:09.682 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.026) 0:02:09.708 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.026) 0:02:09.735 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.034) 0:02:09.769 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.064) 0:02:09.834 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.052) 0:02:09.886 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.051) 0:02:09.937 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.044) 0:02:09.982 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.036) 0:02:10.019 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.041) 0:02:10.060 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.042) 0:02:10.103 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658415107.9212885, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658415106.2982883, "dev": 5, "device_type": 64773, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1808, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658415106.2982883, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:51:57 +0000 (0:00:00.420) 0:02:10.523 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.109) 0:02:10.632 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.041) 0:02:10.674 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.039) 0:02:10.713 ********* skipping: [/cache/fedora-36.qcow2.snap] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.026) 0:02:10.740 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.039) 0:02:10.779 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:51:58 +0000 (0:00:00.026) 0:02:10.805 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:52:00 +0000 (0:00:01.865) 0:02:12.671 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.027) 0:02:12.698 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.026) 0:02:12.725 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.052) 0:02:12.778 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.025) 0:02:12.803 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.024) 0:02:12.827 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.024) 0:02:12.852 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:52:00 +0000 
(0:00:00.024) 0:02:12.876 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.025) 0:02:12.902 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.049) 0:02:12.951 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.047) 0:02:12.998 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.038) 0:02:13.036 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.039) 0:02:13.076 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.037) 0:02:13.114 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.046) 0:02:13.160 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.040) 0:02:13.201 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.036) 0:02:13.237 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.037) 0:02:13.274 ********* 
skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.072) 0:02:13.347 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.037) 0:02:13.384 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.035) 0:02:13.420 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:52:00 +0000 (0:00:00.035) 0:02:13.455 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.413) 0:02:13.868 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.417) 0:02:14.286 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.053) 0:02:14.339 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.038) 0:02:14.378 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.037) 0:02:14.415 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.037) 0:02:14.452 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.041) 0:02:14.493 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} 
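The size verification above reduces to recording an expected value with set_fact and comparing it against the parsed actual size with assert. A minimal sketch of that pattern, using only builtin modules and hypothetical variable wiring (the real tasks live in test-verify-volume-size.yml and may differ):

    - name: Establish base value for expected size
      ansible.builtin.set_fact:
        # assumption: the expected size is taken from the volume spec (4294967296 in this run)
        storage_test_expected_size: "{{ storage_test_volume.size }}"

    - name: Assert the actual size matches the expected size
      ansible.builtin.assert:
        that:
          # storage_test_actual_size.bytes is the parsed size printed in the debug output above
          - (storage_test_actual_size.bytes | int) == (storage_test_expected_size | int)
        msg: "actual size did not match expected size"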
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.036) 0:02:14.530 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.036) 0:02:14.566 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:52:01 +0000 (0:00:00.038) 0:02:14.604 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.035) 0:02:14.639 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.053) 0:02:14.692 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.049783", "end": "2022-07-21 14:52:02.223031", "rc": 0, "start": "2022-07-21 14:52:02.173248" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=Vwi-aotz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.502) 0:02:15.195 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.054) 0:02:15.249 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.089) 0:02:15.339 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.043) 0:02:15.382 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.041) 0:02:15.423 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.038) 0:02:15.462 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.042) 0:02:15.505 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.074) 0:02:15.579 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:52:02 +0000 (0:00:00.024) 0:02:15.604 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove existing LV under existing thinpool] ****************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:91 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.036) 0:02:15.640 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.066) 0:02:15.707 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.037) 0:02:15.745 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.587) 0:02:16.332 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": 
"Conditional result was False" } TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.065) 0:02:16.397 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.038) 0:02:16.436 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.041) 0:02:16.478 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.052) 0:02:16.531 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:52:03 +0000 (0:00:00.024) 0:02:16.555 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:52:05 +0000 (0:00:01.859) 0:02:18.415 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test2", "name": "lv2", "state": "absent", "thin": true, "thin_pool_name": "tpool1" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:52:05 +0000 (0:00:00.040) 0:02:18.456 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:52:05 +0000 (0:00:00.038) 0:02:18.495 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:52:09 +0000 (0:00:03.158) 0:02:21.653 ********* included: 
/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.051) 0:02:21.705 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.049) 0:02:21.755 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.045) 0:02:21.801 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:52:09 +0000 (0:00:00.049) 0:02:21.851 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:52:11 +0000 (0:00:01.810) 0:02:23.661 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { 
"name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": 
"sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { 
"name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:52:13 +0000 (0:00:02.111) 0:02:25.773 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.060) 0:02:25.833 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:52:13 +0000 (0:00:00.024) 0:02:25.857 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" } ], "packages": [ "e2fsprogs", "xfsprogs", "lvm2", "btrfs-progs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:52:17 +0000 (0:00:04.095) 0:02:29.953 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.039) 0:02:29.992 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.022) 0:02:30.014 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" } ], "packages": [ "e2fsprogs", "xfsprogs", "lvm2", "btrfs-progs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools 
for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.042) 0:02:30.057 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.040) 0:02:30.098 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.036) 0:02:30.135 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv2', 'path': '/opt/test2', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:52:17 +0000 (0:00:00.454) 0:02:30.590 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:52:18 +0000 (0:00:00.832) 0:02:31.423 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:52:18 +0000 (0:00:00.040) 0:02:31.464 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:52:19 +0000 (0:00:00.792) 0:02:32.256 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:52:20 +0000 (0:00:00.427) 0:02:32.683 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:52:20 +0000 (0:00:00.023) 0:02:32.707 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:106 Thursday 21 July 2022 14:52:21 +0000 (0:00:01.068) 0:02:33.776 ********* included: /tmp/tmpmb3dyg70/tests/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.053) 0:02:33.829 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.055) 0:02:33.885 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.036) 0:02:33.921 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "f7b48208-c88c-497a-af6d-e29a13610bd7" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "82I52n-I6WQ-XH4c-KoBP-aOXr-t5d0-RlMBbK" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "TLGIA9-8V7A-4pyu-lK2E-tPi9-sEse-k3nds6" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "APpUgy-DXrO-zI5B-bePe-qUlx-RxxN-IhE01f" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-34-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", 
"label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:52:21 +0000 (0:00:00.428) 0:02:34.350 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003941", "end": "2022-07-21 14:52:21.817900", "rc": 0, "start": "2022-07-21 14:52:21.813959" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.435) 0:02:34.786 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003675", "end": "2022-07-21 14:52:22.242843", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:52:22.239168" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.423) 0:02:35.210 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.063) 0:02:35.273 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.040) 0:02:35.313 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.049) 0:02:35.363 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:52:22 +0000 (0:00:00.076) 0:02:35.440 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:52:24 +0000 (0:00:01.199) 0:02:36.639 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.083) 0:02:36.723 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.053) 0:02:36.776 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.083) 0:02:36.860 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.078) 0:02:36.938 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.049) 0:02:36.988 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.025) 0:02:37.013 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] 
*********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.141) 0:02:37.154 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.043) 0:02:37.197 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.024) 0:02:37.222 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.027) 0:02:37.249 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.025) 0:02:37.274 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.024) 0:02:37.298 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.024) 0:02:37.323 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.026) 0:02:37.349 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.024) 0:02:37.374 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.036) 0:02:37.411 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.047) 0:02:37.459 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml for 
/cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.045) 0:02:37.505 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.031) 0:02:37.536 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.027) 0:02:37.564 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:52:24 +0000 (0:00:00.029) 0:02:37.594 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.046) 0:02:37.640 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': 
'/dev/mapper/vg1-lv2'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.047) 0:02:37.687 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.031) 0:02:37.719 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.030) 0:02:37.749 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.029) 0:02:37.779 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.030) 0:02:37.809 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.050) 0:02:37.859 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.051) 0:02:37.911 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.033) 0:02:37.944 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => 
(item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.055) 0:02:38.000 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.048) 0:02:38.048 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.051) 0:02:38.100 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.038) 0:02:38.138 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.034) 0:02:38.173 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.040) 0:02:38.213 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.071) 0:02:38.285 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.122) 0:02:38.408 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.050) 0:02:38.459 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.035) 0:02:38.494 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.035) 0:02:38.530 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.037) 0:02:38.568 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 14:52:25 +0000 (0:00:00.037) 0:02:38.605 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:6 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.055) 0:02:38.661 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:11 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.050) 0:02:38.711 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:17 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.038) 0:02:38.750 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:23 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.039) 0:02:38.789 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-crypttab.yml:29 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.039) 0:02:38.829 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.037) 0:02:38.866 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.051) 0:02:38.917 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.056) 0:02:38.974 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 
'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.051) 0:02:39.025 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.027) 0:02:39.053 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.027) 0:02:39.080 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.028) 0:02:39.109 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.027) 0:02:39.136 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.027) 0:02:39.164 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.027) 0:02:39.191 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.029) 0:02:39.221 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.035) 0:02:39.257 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.036) 0:02:39.293 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.043) 0:02:39.337 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.051) 0:02:39.388 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.083) 0:02:39.472 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": 
"/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:52:26 +0000 (0:00:00.046) 0:02:39.518 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.110) 0:02:39.629 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.026) 0:02:39.656 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.133) 0:02:39.790 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.037) 0:02:39.827 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.026) 0:02:39.853 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.028) 0:02:39.881 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.025) 0:02:39.907 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.034) 0:02:39.942 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], 
"storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.066) 0:02:40.008 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.025) 0:02:40.034 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.054) 0:02:40.088 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.037) 0:02:40.125 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.040) 0:02:40.165 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.024) 0:02:40.190 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:52:27 +0000 (0:00:00.022) 0:02:40.213 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.405) 0:02:40.619 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.039) 0:02:40.658 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.023) 0:02:40.682 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": 
false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.043) 0:02:40.726 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.028) 0:02:40.754 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.025) 0:02:40.780 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:52:28 +0000 (0:00:00.026) 0:02:40.806 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:52:30 +0000 (0:00:01.926) 0:02:42.733 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.025) 0:02:42.758 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.024) 0:02:42.783 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.024) 0:02:42.807 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.023) 0:02:42.831 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.024) 0:02:42.855 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.024) 
0:02:42.880 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.025) 0:02:42.905 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.023) 0:02:42.929 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.091) 0:02:43.021 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.052) 0:02:43.073 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.038) 0:02:43.111 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.037) 0:02:43.149 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.073) 0:02:43.222 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.036) 0:02:43.259 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.038) 0:02:43.297 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.041) 0:02:43.338 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.036) 0:02:43.375 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.036) 0:02:43.411 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.037) 0:02:43.449 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.036) 0:02:43.486 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.036) 0:02:43.522 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.024) 0:02:43.547 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:52:30 +0000 (0:00:00.042) 0:02:43.590 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.040) 0:02:43.631 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.043) 0:02:43.674 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.043) 0:02:43.718 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.040) 0:02:43.758 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.038) 0:02:43.797 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.042) 0:02:43.839 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.038) 0:02:43.878 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.036) 0:02:43.915 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.038) 0:02:43.953 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.027) 0:02:43.981 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.026) 0:02:44.008 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.029) 0:02:44.037 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.026) 0:02:44.064 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.026) 0:02:44.090 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.026) 0:02:44.117 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.026) 0:02:44.144 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.024) 0:02:44.168 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.035) 0:02:44.204 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.024) 0:02:44.228 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Cleanup] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:108 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.039) 0:02:44.267 ********* TASK [linux-system-roles.storage : set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:2 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.073) 0:02:44.341 ********* included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : Ensure ansible_facts used by role] ********** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2 Thursday 21 July 2022 14:52:31 +0000 (0:00:00.037) 0:02:44.378 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [linux-system-roles.storage : Set platform/version specific variables] **** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.605) 0:02:44.984 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK 
[linux-system-roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:5 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.067) 0:02:45.052 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:9 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.036) 0:02:45.088 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : include the appropriate provider tasks] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main.yml:13 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.037) 0:02:45.126 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.050) 0:02:45.177 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure blivet is available] ************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 14:52:32 +0000 (0:00:00.022) 0:02:45.200 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : show storage_pools] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14 Thursday 21 July 2022 14:52:34 +0000 (0:00:01.807) 0:02:47.008 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "3g", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g" } ] } ] } TASK [linux-system-roles.storage : show storage_volumes] *********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19 Thursday 21 July 2022 14:52:34 +0000 (0:00:00.041) 0:02:47.049 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [linux-system-roles.storage : get required packages] ********************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 Thursday 21 July 2022 14:52:34 +0000 (0:00:00.036) 0:02:47.085 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [linux-system-roles.storage : enable copr repositories if needed] ********* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37 Thursday 21 July 2022 14:52:45 +0000 (0:00:11.356) 0:02:58.441 ********* included: 
/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [linux-system-roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 14:52:45 +0000 (0:00:00.054) 0:02:58.496 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 14:52:45 +0000 (0:00:00.052) 0:02:58.549 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : enable COPRs] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 14:52:45 +0000 (0:00:00.040) 0:02:58.589 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : make sure required packages are installed] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 Thursday 21 July 2022 14:52:46 +0000 (0:00:00.046) 0:02:58.636 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [linux-system-roles.storage : get service facts] ************************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 Thursday 21 July 2022 14:52:47 +0000 (0:00:01.869) 0:03:00.505 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { 
"name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": 
"sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { 
"name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************ task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 14:52:50 +0000 (0:00:02.150) 0:03:02.655 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] ******* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71 Thursday 21 July 2022 14:52:50 +0000 (0:00:00.061) 0:03:02.717 ********* TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 Thursday 21 July 2022 14:52:50 +0000 (0:00:00.024) 0:03:02.742 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [], "packages": [ "btrfs-progs", "e2fsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91 Thursday 21 July 2022 14:52:57 +0000 (0:00:07.229) 0:03:09.971 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] ***** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.040) 0:03:10.012 ********* TASK [linux-system-roles.storage : show blivet_output] ************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.023) 0:03:10.035 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [], "packages": [ "btrfs-progs", "e2fsprogs", 
"dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [linux-system-roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.049) 0:03:10.085 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [linux-system-roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.051) 0:03:10.137 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [linux-system-roles.storage : remove obsolete mounts] ********************* task path: 
/tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.041) 0:03:10.178 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.037) 0:03:10.215 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : set up new/current mounts] ****************** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.024) 0:03:10.240 ********* TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.038) 0:03:10.279 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171 Thursday 21 July 2022 14:52:57 +0000 (0:00:00.025) 0:03:10.304 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176 Thursday 21 July 2022 14:52:58 +0000 (0:00:00.418) 0:03:10.723 ********* TASK [linux-system-roles.storage : Update facts] ******************************* task path: /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 Thursday 21 July 2022 14:52:58 +0000 (0:00:00.024) 0:03:10.747 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/tests_create_thinp_then_remove.yml:125 Thursday 21 July 2022 14:52:59 +0000 (0:00:01.027) 0:03:11.774 ********* included: /tmp/tmpmb3dyg70/tests/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:1 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.081) 0:03:11.856 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:6 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.052) 0:03:11.909 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:14 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.037) 0:03:11.946 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-14-49-34-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", 
"type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:19 Thursday 21 July 2022 14:52:59 +0000 (0:00:00.438) 0:03:12.385 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003419", "end": "2022-07-21 14:52:59.852472", "rc": 0, "start": "2022-07-21 14:52:59.849053" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:24 Thursday 21 July 2022 14:53:00 +0000 (0:00:00.433) 0:03:12.819 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.005492", "end": "2022-07-21 14:53:01.275570", "failed_when_result": false, "rc": 0, "start": "2022-07-21 14:53:00.270078" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:33 Thursday 21 July 2022 14:53:01 +0000 (0:00:01.431) 0:03:14.250 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'absent', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': 
None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:5 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.061) 0:03:14.312 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool.yml:18 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.044) 0:03:14.356 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:1 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.047) 0:03:14.403 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:10 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.088) 0:03:14.492 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:19 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.022) 0:03:14.514 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "0" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:23 Thursday 21 July 2022 14:53:01 +0000 (0:00:00.091) 0:03:14.606 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:27 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.115) 0:03:14.721 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:34 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.049) 0:03:14.771 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:38 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.073) 0:03:14.844 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:42 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.125) 0:03:14.969 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:46 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.025) 0:03:14.995 ********* TASK [Check MD RAID] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:56 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.024) 0:03:15.019 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:6 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.043) 0:03:15.063 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:12 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.023) 0:03:15.087 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:16 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.025) 0:03:15.112 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:20 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.023) 0:03:15.135 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:24 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.024) 0:03:15.159 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:30 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.023) 0:03:15.182 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:36 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.026) 0:03:15.209 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-md.yml:44 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.024) 0:03:15.233 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] 
********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:59 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.036) 0:03:15.270 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.045) 0:03:15.315 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.046) 0:03:15.362 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.030) 0:03:15.392 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.028) 0:03:15.421 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:62 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.028) 0:03:15.449 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-thin.yml:1 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.048) 0:03:15.498 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 
'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:3 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.046) 0:03:15.544 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:8 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.029) 0:03:15.574 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:13 Thursday 21 July 2022 14:53:02 +0000 (0:00:00.027) 0:03:15.602 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-thin.yml:17 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.026) 0:03:15.628 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:65 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.027) 0:03:15.656 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.046) 0:03:15.703 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.051) 0:03:15.754 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.022) 0:03:15.776 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.021) 0:03:15.797 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** 
task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:68 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.034) 0:03:15.832 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.051) 0:03:15.884 ********* included: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.047) 0:03:15.932 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.032) 0:03:15.964 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.028) 0:03:15.992 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.026) 0:03:16.019 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.026) 0:03:16.046 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.027) 0:03:16.073 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task 
path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.070) 0:03:16.144 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.026) 0:03:16.170 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-members.yml:71 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.037) 0:03:16.208 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.038) 0:03:16.246 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:2 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.046) 0:03:16.293 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:10 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.048) 0:03:16.341 ********* included: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: 
/tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:6 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.078) 0:03:16.419 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:14 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.045) 0:03:16.465 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:28 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.055) 0:03:16.521 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:37 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.025) 0:03:16.546 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:45 Thursday 21 July 2022 14:53:03 +0000 (0:00:00.053) 0:03:16.599 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:54 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.035) 0:03:16.635 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:58 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.025) 0:03:16.661 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:63 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.025) 0:03:16.687 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-mount.yml:75 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.027) 0:03:16.714 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, 
"storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.036) 0:03:16.751 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.068) 0:03:16.819 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:32 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.025) 0:03:16.844 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:39 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.052) 0:03:16.896 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fstab.yml:49 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.034) 0:03:16.930 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:4 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.033) 0:03:16.963 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-fs.yml:10 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.025) 0:03:16.989 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:4 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.024) 0:03:17.014 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:10 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.414) 0:03:17.428 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:18 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.039) 0:03:17.468 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:24 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.025) 0:03:17.494 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:28 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.038) 0:03:17.532 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-device.yml:33 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.024) 0:03:17.556 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.022) 0:03:17.578 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 14:53:04 +0000 (0:00:00.024) 0:03:17.603 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 14:53:06 +0000 (0:00:01.858) 0:03:19.462 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.026) 0:03:19.488 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:30 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.024) 0:03:19.513 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:38 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.023) 0:03:19.536 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.024) 0:03:19.561 ********* skipping: [/cache/fedora-36.qcow2.snap] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:49 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.025) 0:03:19.586 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:55 Thursday 21 July 2022 14:53:06 +0000 (0:00:00.025) 0:03:19.612 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:61 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.024) 0:03:19.636 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.025) 0:03:19.661 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:74 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.048) 0:03:19.710 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:79 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.049) 0:03:19.759 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:85 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.041) 0:03:19.801 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:91 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:19.839 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-encryption.yml:97 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.035) 0:03:19.875 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:7 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.034) 0:03:19.909 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:13 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:19.948 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:17 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.035) 0:03:19.984 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:21 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.039) 0:03:20.023 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:25 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.037) 0:03:20.060 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:31 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.043) 0:03:20.104 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-md.yml:37 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.036) 0:03:20.140 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:3 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:20.179 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:9 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.026) 0:03:20.206 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:15 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.043) 0:03:20.249 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:20 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:20.288 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:25 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.037) 0:03:20.326 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:28 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.039) 0:03:20.365 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:31 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.044) 0:03:20.410 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:36 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.042) 0:03:20.453 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:39 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:20.492 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:44 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.037) 0:03:20.529 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:47 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.043) 0:03:20.573 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-size.yml:50 Thursday 21 July 2022 14:53:07 +0000 (0:00:00.038) 0:03:20.612 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:6 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.025) 0:03:20.637 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:14 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.025) 0:03:20.663 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:17 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.027) 0:03:20.690 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:22 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.025) 0:03:20.716 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } 
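[annotation] For reference, the thin-pool layout being verified in the surrounding tasks can be reconstructed from the volume facts echoed in the loop items above (pool vg1 on sda/sdb/sdc, thin pool tpool1 of 10g, thin volume lv1 of 3g mounted at /opt/test1 as xfs). The following is a minimal sketch of a storage_pools invocation that would request that layout; it is inferred from those facts, not quoted from the test playbook, and the host pattern and task names are placeholders.

- hosts: all                              # placeholder host pattern
  tasks:
    - name: Create a thin pool and a thin volume (sketch inferred from the logged facts)
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: vg1                     # pool name inferred from /dev/mapper/vg1-lv1
            disks: ['sda', 'sdb', 'sdc']
            volumes:
              - name: lv1
                thin: true
                thin_pool_name: tpool1
                thin_pool_size: 10g
                size: 3g
                mount_point: /opt/test1
                fs_type: xfs
    # The "then remove" half of the test would repeat the call with state: absent
    # on the pool/volume entries (the 'state' key is visible in the loop items above);
    # the exact removal parameters are not shown in this part of the log.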
TASK [parse the requested cache size] ****************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:26 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.040) 0:03:20.756 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:32 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.027) 0:03:20.784 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume-cache.yml:36 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.026) 0:03:20.811 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmpmb3dyg70/tests/test-verify-volume.yml:16 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.026) 0:03:20.838 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:43 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.081) 0:03:20.919 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmpmb3dyg70/tests/verify-role-results.yml:53 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.028) 0:03:20.947 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/fedora-36.qcow2.snap : ok=648 changed=8 unreachable=0 failed=0 skipped=476 rescued=0 ignored=0 Thursday 21 July 2022 14:53:08 +0000 (0:00:00.098) 0:03:21.045 ********* =============================================================================== linux-system-roles.storage : get required packages --------------------- 11.36s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 7.23s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 4.94s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 4.10s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.53s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.19s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : manage the pools and volumes to match the specified state --- 3.18s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 linux-system-roles.storage : 
get required packages ---------------------- 3.16s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : get required packages ---------------------- 3.03s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : get required packages ---------------------- 3.02s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : get required packages ---------------------- 2.91s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 linux-system-roles.storage : make sure blivet is available -------------- 2.45s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 linux-system-roles.storage : get service facts -------------------------- 2.19s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get service facts -------------------------- 2.15s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get service facts -------------------------- 2.11s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get service facts -------------------------- 2.11s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get service facts -------------------------- 2.08s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get service facts -------------------------- 2.06s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get service facts -------------------------- 2.05s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 linux-system-roles.storage : get required packages ---------------------- 2.04s /tmp/tmpmb3dyg70/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 ansible-playbook [core 2.12.6] config file = /etc/ansible/ansible.cfg configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/lib/python3.9/site-packages/ansible ansible collection location = /tmp/tmpdbh6f40u executable location = /usr/bin/ansible-playbook python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)] jinja version = 2.11.3 libyaml = True Using /etc/ansible/ansible.cfg as config file Skipping callback 'debug', as we already have a stdout callback. Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
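[annotation] The second pass that follows exercises the same scenario with the role consumed from the fedora.linux_system_roles collection (note the task paths under /tmp/tmpdbh6f40u/ansible_collections/ below). Assuming the collection is installed, the invocation differs from the sketch above only in the fully-qualified role name:

- hosts: all                              # placeholder host pattern
  tasks:
    - name: Same thin-pool scenario via the collection-packaged role (sketch)
      include_role:
        name: fedora.linux_system_roles.storage   # fully-qualified name seen in the task headers below
      # vars: identical storage_pools definition to the sketch above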
PLAYBOOK: tests_create_thinp_then_remove_scsi_generated.yml ******************** 2 plays in /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove_scsi_generated.yml PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove_scsi_generated.yml:3 Thursday 21 July 2022 18:46:34 +0000 (0:00:00.012) 0:00:00.012 ********* ok: [/cache/fedora-36.qcow2.snap] META: ran handlers TASK [set disk interface for test] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove_scsi_generated.yml:7 Thursday 21 July 2022 18:46:36 +0000 (0:00:01.382) 0:00:01.395 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false } META: ran handlers META: ran handlers PLAY [all] ********************************************************************* TASK [Gathering Facts] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:2 Thursday 21 July 2022 18:46:36 +0000 (0:00:00.050) 0:00:01.445 ********* ok: [/cache/fedora-36.qcow2.snap] META: ran handlers TASK [include_role : fedora.linux_system_roles.storage] ************************ task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:14 Thursday 21 July 2022 18:46:37 +0000 (0:00:01.008) 0:00:02.454 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:46:37 +0000 (0:00:00.038) 0:00:02.493 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:46:37 +0000 (0:00:00.032) 0:00:02.526 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:46:37 +0000 (0:00:00.569) 0:00:03.095 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:46:37 +0000 (0:00:00.056) 0:00:03.151 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:46:37 +0000 (0:00:00.031) 0:00:03.183 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:46:38 +0000 (0:00:00.032) 0:00:03.215 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:46:38 +0000 (0:00:00.057) 0:00:03.273 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:46:38 +0000 (0:00:00.018) 0:00:03.291 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:46:40 +0000 (0:00:02.597) 0:00:05.888 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:46:40 +0000 (0:00:00.036) 0:00:05.924 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:46:40 +0000 (0:00:00.062) 0:00:05.987 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:46:41 +0000 (0:00:00.783) 0:00:06.770 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:46:41 +0000 (0:00:00.043) 0:00:06.814 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:46:41 +0000 (0:00:00.041) 0:00:06.856 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:46:41 +0000 (0:00:00.038) 0:00:06.895 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:46:41 +0000 (0:00:00.072) 0:00:06.967 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:46:43 +0000 (0:00:01.894) 0:00:08.862 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", 
"status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "inactive", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": 
"stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { 
"name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:46:45 +0000 (0:00:02.225) 0:00:11.088 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:46:45 +0000 (0:00:00.059) 0:00:11.147 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:46:45 +0000 (0:00:00.023) 0:00:11.171 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.575) 0:00:11.746 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.037) 0:00:11.783 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.021) 0:00:11.805 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 
Thursday 21 July 2022 18:46:46 +0000 (0:00:00.036) 0:00:11.841 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.035) 0:00:11.877 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.037) 0:00:11.914 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.037) 0:00:11.951 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.026) 0:00:11.977 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.035) 0:00:12.013 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:46:46 +0000 (0:00:00.026) 0:00:12.039 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:46:47 +0000 (0:00:00.543) 0:00:12.583 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:46:47 +0000 
(0:00:00.022) 0:00:12.606 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:17 Thursday 21 July 2022 18:46:48 +0000 (0:00:00.997) 0:00:13.603 ********* included: /tmp/tmppde4z2jm/tests/storage/get_unused_disk.yml for /cache/fedora-36.qcow2.snap TASK [Find unused disks in the system] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/get_unused_disk.yml:2 Thursday 21 July 2022 18:46:48 +0000 (0:00:00.036) 0:00:13.640 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "disks": [ "sda", "sdb", "sdc" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/get_unused_disk.yml:9 Thursday 21 July 2022 18:46:49 +0000 (0:00:01.571) 0:00:15.211 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/tmppde4z2jm/tests/storage/get_unused_disk.yml:14 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.036) 0:00:15.247 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/get_unused_disk.yml:19 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.037) 0:00:15.285 ********* ok: [/cache/fedora-36.qcow2.snap] => { "unused_disks": [ "sda", "sdb", "sdc" ] } TASK [Create a thinpool device] ************************************************ task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:21 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.036) 0:00:15.322 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.043) 0:00:15.365 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.036) 0:00:15.402 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.592) 0:00:15.994 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ 
"/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.084) 0:00:16.078 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.033) 0:00:16.112 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:46:50 +0000 (0:00:00.032) 0:00:16.144 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:46:51 +0000 (0:00:00.078) 0:00:16.223 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:46:51 +0000 (0:00:00.020) 0:00:16.243 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:46:53 +0000 (0:00:02.000) 0:00:18.244 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "state": "present", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "3g", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:46:53 +0000 (0:00:00.037) 0:00:18.281 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 
'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:46:53 +0000 (0:00:00.035) 0:00:18.317 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:46:55 +0000 (0:00:02.003) 0:00:20.321 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:46:55 +0000 (0:00:00.047) 0:00:20.368 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:46:55 +0000 (0:00:00.050) 0:00:20.418 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:46:55 +0000 (0:00:00.040) 0:00:20.459 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:46:55 +0000 (0:00:00.048) 0:00:20.507 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:46:57 +0000 (0:00:01.811) 0:00:22.318 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { 
"name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": 
"network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, 
"sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": 
"systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { 
"name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:46:59 +0000 (0:00:02.052) 0:00:24.371 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:46:59 +0000 (0:00:00.058) 0:00:24.429 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:46:59 +0000 (0:00:00.023) 0:00:24.452 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-tpool1", 
"fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "lvm2", "xfsprogs", "btrfs-progs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:47:04 +0000 (0:00:04.992) 0:00:29.445 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:47:04 +0000 (0:00:00.101) 0:00:29.546 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:47:04 +0000 (0:00:00.024) 0:00:29.571 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdc1", "fs_type": null }, { "action": "create format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sdb1", "fs_type": null }, { "action": "create format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": 
"create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "e2fsprogs", "dosfstools", "lvm2", "xfsprogs", "btrfs-progs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:47:04 +0000 (0:00:00.040) 0:00:29.611 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:47:04 +0000 (0:00:00.039) 0:00:29.651 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:47:04 +0000 (0:00:00.035) 0:00:29.686 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:47:04 +0000 (0:00:00.037) 0:00:29.723 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:47:05 +0000 (0:00:01.068) 0:00:30.792 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:47:06 +0000 (0:00:00.745) 0:00:31.537 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:47:07 +0000 (0:00:00.785) 0:00:32.322 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:47:07 +0000 (0:00:00.428) 0:00:32.751 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:47:07 +0000 (0:00:00.025) 0:00:32.776 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:38 Thursday 21 July 2022 18:47:08 +0000 (0:00:01.024) 0:00:33.801 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:47:08 +0000 (0:00:00.043) 0:00:33.845 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:47:08 +0000 (0:00:00.116) 0:00:33.961 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:47:08 +0000 (0:00:00.036) 0:00:33.998 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "VRYiF9-IY7Q-wqcm-zFnA-pH2T-GW3L-t36A6W" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "8FU0c7-czN0-wzE1-pLCY-NCkN-yPM6-xEKnFs" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "3WNlUE-d8MU-Bh4V-INqY-gvUm-3CSU-5WTY6B" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-18-46-22-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": 
"738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:47:09 +0000 (0:00:00.525) 0:00:34.524 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003375", "end": "2022-07-21 18:47:09.985384", "rc": 0, "start": "2022-07-21 18:47:09.982009" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:47:09 +0000 (0:00:00.504) 0:00:35.029 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003691", "end": "2022-07-21 18:47:10.411197", "failed_when_result": false, "rc": 0, "start": "2022-07-21 18:47:10.407506" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:47:10 +0000 (0:00:00.437) 0:00:35.467 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 
0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:47:10 +0000 (0:00:00.068) 0:00:35.535 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:47:10 +0000 (0:00:00.034) 0:00:35.570 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:47:10 +0000 (0:00:00.047) 0:00:35.618 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:47:10 +0000 (0:00:00.053) 0:00:35.671 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:47:11 +0000 (0:00:01.304) 0:00:36.976 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:47:11 +0000 (0:00:00.078) 0:00:37.055 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:47:11 +0000 (0:00:00.062) 0:00:37.117 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.097) 0:00:37.215 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { 
"_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.104) 0:00:37.320 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.049) 0:00:37.369 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.024) 0:00:37.393 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.069) 0:00:37.463 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.048) 0:00:37.512 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.028) 0:00:37.540 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.024) 0:00:37.565 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.024) 0:00:37.590 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.023) 0:00:37.614 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:30 Thursday 21 
July 2022 18:47:12 +0000 (0:00:00.024) 0:00:37.638 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.023) 0:00:37.662 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.023) 0:00:37.686 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.034) 0:00:37.721 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.045) 0:00:37.766 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.043) 0:00:37.810 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.027) 0:00:37.837 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.027) 0:00:37.865 ********* skipping: [/cache/fedora-36.qcow2.snap] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.026) 0:00:37.892 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.046) 0:00:37.938 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:47:12 +0000 (0:00:00.045) 0:00:37.984 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv1&&segtype=thin", "vg1" ], "delta": "0:00:00.042229", "end": "2022-07-21 18:47:13.403225", "rc": 0, "start": "2022-07-21 18:47:13.360996" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.463) 0:00:38.448 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.055) 0:00:38.504 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.050) 0:00:38.555 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.039) 0:00:38.594 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] 
**************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.048) 0:00:38.642 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.087) 0:00:38.730 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.032) 0:00:38.763 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.089) 0:00:38.852 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.049) 0:00:38.901 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.047) 0:00:38.949 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.036) 0:00:38.985 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.038) 0:00:39.024 ********* skipping: [/cache/fedora-36.qcow2.snap] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.037) 0:00:39.061 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.034) 0:00:39.096 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.048) 0:00:39.145 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:47:13 +0000 (0:00:00.048) 0:00:39.194 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.036) 0:00:39.231 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.036) 0:00:39.267 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.037) 0:00:39.304 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.032) 0:00:39.337 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.048) 0:00:39.385 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.049) 0:00:39.434 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.034) 0:00:39.469 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.034) 0:00:39.504 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.037) 0:00:39.541 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.032) 0:00:39.574 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.033) 0:00:39.608 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.049) 0:00:39.657 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.045) 0:00:39.703 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.024) 0:00:39.727 ********* 
skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.023) 0:00:39.751 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.023) 0:00:39.775 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.023) 0:00:39.798 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.024) 0:00:39.823 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.023) 0:00:39.847 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.022) 0:00:39.870 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.033) 0:00:39.903 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.034) 0:00:39.938 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': [], 
'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.041) 0:00:39.980 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.078) 0:00:40.059 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:47:14 +0000 (0:00:00.120) 0:00:40.179 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.039) 0:00:40.219 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 
3154259968, "size_total": 3210739712, "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.054) 0:00:40.273 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.048) 0:00:40.322 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.048) 0:00:40.371 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.050) 0:00:40.421 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.023) 0:00:40.445 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.023) 0:00:40.469 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.022) 0:00:40.491 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.035) 0:00:40.527 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.057) 0:00:40.585 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.047) 0:00:40.633 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.046) 0:00:40.679 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.034) 0:00:40.714 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.033) 0:00:40.747 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.038) 0:00:40.786 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:47:15 +0000 (0:00:00.037) 0:00:40.823 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658429226.489015, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658429224.325015, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1173, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658429224.325015, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.417) 0:00:41.241 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.038) 0:00:41.279 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.039) 0:00:41.319 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.038) 0:00:41.357 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.023) 0:00:41.381 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.039) 0:00:41.420 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:47:16 +0000 (0:00:00.024) 0:00:41.445 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:47:18 +0000 (0:00:02.016) 0:00:43.461 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.024) 0:00:43.485 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.022) 0:00:43.508 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.049) 0:00:43.558 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.024) 0:00:43.582 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.024) 0:00:43.606 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.025) 0:00:43.632 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.024) 0:00:43.656 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.024) 0:00:43.681 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.051) 0:00:43.732 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.050) 0:00:43.783 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.036) 0:00:43.819 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.035) 0:00:43.854 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.039) 0:00:43.893 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.033) 0:00:43.927 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.035) 0:00:43.963 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.036) 0:00:43.999 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.035) 0:00:44.035 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.036) 0:00:44.071 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.036) 0:00:44.108 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.035) 0:00:44.144 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:47:18 +0000 (0:00:00.037) 0:00:44.181 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:47:19 +0000 (0:00:00.567) 0:00:44.748 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:47:19 +0000 (0:00:00.444) 0:00:45.193 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.076) 0:00:45.269 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:47:20 
+0000 (0:00:00.088) 0:00:45.358 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.036) 0:00:45.395 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.036) 0:00:45.431 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.036) 0:00:45.467 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.035) 0:00:45.503 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.034) 0:00:45.538 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.039) 0:00:45.578 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.035) 0:00:45.613 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.050) 0:00:45.664 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.044590", "end": "2022-07-21 18:47:21.070170", "rc": 0, "start": "2022-07-21 18:47:21.025580" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=Vwi-aotz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:47:20 +0000 (0:00:00.448) 0:00:46.112 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:47:20 +0000 
(0:00:00.050) 0:00:46.162 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.053) 0:00:46.216 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.040) 0:00:46.256 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.038) 0:00:46.294 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.040) 0:00:46.334 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.037) 0:00:46.372 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.034) 0:00:46.406 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.021) 0:00:46.428 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Repeat previous invocation to verify idempotence] ************************ task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:40 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.035) 0:00:46.464 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.050) 0:00:46.514 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.037) 0:00:46.552 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.544) 0:00:47.096 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:47:21 +0000 (0:00:00.095) 0:00:47.192 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:47:22 +0000 (0:00:00.098) 0:00:47.291 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:47:22 +0000 (0:00:00.034) 0:00:47.326 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:47:22 +0000 (0:00:00.057) 0:00:47.383 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:47:22 +0000 (0:00:00.021) 0:00:47.405 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:47:23 +0000 (0:00:01.806) 0:00:49.211 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "3g", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:47:24 +0000 (0:00:00.040) 0:00:49.251 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:47:24 +0000 (0:00:00.037) 0:00:49.289 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:47:27 +0000 (0:00:03.014) 0:00:52.304 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:47:27 +0000 (0:00:00.047) 0:00:52.351 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:47:27 +0000 (0:00:00.046) 0:00:52.398 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:47:27 +0000 (0:00:00.038) 0:00:52.437 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:47:27 +0000 (0:00:00.046) 0:00:52.483 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 
0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:47:29 +0000 (0:00:01.955) 0:00:54.438 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", 
"status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { 
"name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:47:31 +0000 (0:00:02.151) 0:00:56.589 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:47:31 +0000 (0:00:00.083) 0:00:56.673 ********* TASK 
[fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:47:31 +0000 (0:00:00.022) 0:00:56.696 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "btrfs-progs", "e2fsprogs", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:47:34 +0000 (0:00:03.259) 0:00:59.955 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:47:34 +0000 (0:00:00.037) 0:00:59.993 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:47:34 +0000 (0:00:00.023) 0:01:00.016 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", 
"/dev/zram0" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" } ], "packages": [ "btrfs-progs", "e2fsprogs", "lvm2", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:47:34 +0000 (0:00:00.041) 0:01:00.058 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test 
verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:47:34 +0000 (0:00:00.039) 0:01:00.098 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:47:34 +0000 (0:00:00.039) 0:01:00.137 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:47:34 +0000 (0:00:00.041) 0:01:00.179 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:47:35 +0000 (0:00:00.781) 0:01:00.960 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/vg1-lv1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv1" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:47:36 +0000 (0:00:00.425) 0:01:01.386 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:47:36 +0000 (0:00:00.767) 0:01:02.153 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:47:37 +0000 (0:00:00.421) 0:01:02.575 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:47:37 +0000 (0:00:00.023) 0:01:02.598 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:56 Thursday 21 July 2022 18:47:38 +0000 (0:00:01.063) 0:01:03.662 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:47:38 +0000 (0:00:00.046) 0:01:03.708 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:47:38 +0000 (0:00:00.069) 0:01:03.778 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:47:38 +0000 (0:00:00.038) 0:01:03.816 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "VRYiF9-IY7Q-wqcm-zFnA-pH2T-GW3L-t36A6W" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "8FU0c7-czN0-wzE1-pLCY-NCkN-yPM6-xEKnFs" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "3WNlUE-d8MU-Bh4V-INqY-gvUm-3CSU-5WTY6B" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-18-46-22-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": 
"738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:47:39 +0000 (0:00:00.441) 0:01:04.257 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003330", "end": "2022-07-21 18:47:39.631451", "rc": 0, "start": "2022-07-21 18:47:39.628121" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/mapper/vg1-lv1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:47:39 +0000 (0:00:00.419) 0:01:04.677 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003340", "end": "2022-07-21 18:47:40.045912", "failed_when_result": false, "rc": 0, "start": "2022-07-21 18:47:40.042572" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:47:39 +0000 (0:00:00.413) 0:01:05.090 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 
0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:47:39 +0000 (0:00:00.076) 0:01:05.167 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:47:39 +0000 (0:00:00.043) 0:01:05.210 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:47:40 +0000 (0:00:00.073) 0:01:05.284 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:47:40 +0000 (0:00:00.072) 0:01:05.357 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:47:41 +0000 (0:00:01.167) 0:01:06.525 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.049) 0:01:06.575 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.050) 0:01:06.625 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.051) 0:01:06.677 ********* ok: [/cache/fedora-36.qcow2.snap] => { 
"ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.040) 0:01:06.718 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.049) 0:01:06.767 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.024) 0:01:06.791 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.070) 0:01:06.862 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.044) 0:01:06.906 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.026) 0:01:06.933 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.025) 0:01:06.958 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.025) 0:01:06.984 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.024) 0:01:07.009 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.026) 0:01:07.036 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.025) 0:01:07.061 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.025) 0:01:07.087 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.035) 0:01:07.123 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:47:41 +0000 (0:00:00.047) 0:01:07.170 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.045) 0:01:07.215 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.028) 0:01:07.244 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:47:42 +0000 
(0:00:00.028) 0:01:07.272 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.031) 0:01:07.304 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.050) 0:01:07.355 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.048) 0:01:07.403 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv1&&segtype=thin", "vg1" ], "delta": "0:00:00.042446", "end": "2022-07-21 18:47:42.826303", "rc": 0, "start": "2022-07-21 18:47:42.783857" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.468) 0:01:07.872 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.125) 0:01:07.997 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.055) 0:01:08.053 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.044) 0:01:08.097 ********* included: 
/tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.050) 0:01:08.147 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:47:42 +0000 (0:00:00.052) 0:01:08.200 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.034) 0:01:08.234 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.058) 0:01:08.293 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.052) 0:01:08.345 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.051) 0:01:08.396 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.039) 0:01:08.435 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 
Thursday 21 July 2022 18:47:43 +0000 (0:00:00.039) 0:01:08.475 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.038) 0:01:08.514 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.038) 0:01:08.552 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.051) 0:01:08.604 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.046) 0:01:08.651 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.034) 0:01:08.685 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.035) 0:01:08.721 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.034) 0:01:08.756 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.033) 0:01:08.789 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.048) 0:01:08.838 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.046) 0:01:08.884 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.037) 0:01:08.922 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.036) 0:01:08.959 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.038) 0:01:08.997 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.040) 0:01:09.037 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.035) 0:01:09.072 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.047) 0:01:09.120 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.044) 0:01:09.165 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:47:43 +0000 (0:00:00.023) 0:01:09.188 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.025) 0:01:09.214 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.026) 0:01:09.240 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.053) 0:01:09.294 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.024) 0:01:09.319 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.023) 0:01:09.343 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.024) 0:01:09.367 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.032) 0:01:09.400 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.037) 0:01:09.437 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': 
None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.047) 0:01:09.484 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.050) 0:01:09.534 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.083) 0:01:09.617 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.045) 0:01:09.663 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, "inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 770083, "block_size": 4096, "block_total": 783872, "block_used": 13789, "device": "/dev/mapper/vg1-lv1", "fstype": "xfs", "inode_available": 1572861, "inode_total": 1572864, 
"inode_used": 3, "mount": "/opt/test1", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 3154259968, "size_total": 3210739712, "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.059) 0:01:09.723 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.053) 0:01:09.776 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.055) 0:01:09.832 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.054) 0:01:09.887 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.027) 0:01:09.915 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.027) 0:01:09.942 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.029) 0:01:09.972 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.038) 0:01:10.010 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] 
***************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.067) 0:01:10.077 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.061) 0:01:10.139 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:47:44 +0000 (0:00:00.051) 0:01:10.190 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.040) 0:01:10.230 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.040) 0:01:10.271 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.042) 0:01:10.313 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.044) 0:01:10.357 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658429226.489015, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658429224.325015, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1173, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658429224.325015, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.416) 0:01:10.774 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.086) 0:01:10.861 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.130) 0:01:10.991 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.038) 0:01:11.030 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.025) 0:01:11.056 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.042) 0:01:11.098 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:47:45 +0000 (0:00:00.025) 0:01:11.124 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:47:47 +0000 (0:00:01.938) 0:01:13.063 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:47:47 +0000 (0:00:00.026) 0:01:13.090 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:47:47 +0000 (0:00:00.025) 0:01:13.116 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:47:47 +0000 (0:00:00.054) 0:01:13.170 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:47:47 +0000 (0:00:00.027) 0:01:13.198 ********* skipping: [/cache/fedora-36.qcow2.snap] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.025) 0:01:13.223 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.024) 0:01:13.248 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.024) 0:01:13.273 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.027) 0:01:13.301 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.051) 0:01:13.352 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.051) 0:01:13.403 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.040) 0:01:13.444 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.038) 0:01:13.482 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.038) 0:01:13.521 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.076) 
0:01:13.597 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.039) 0:01:13.636 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.040) 0:01:13.677 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.042) 0:01:13.719 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.080) 0:01:13.800 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.040) 0:01:13.840 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.038) 0:01:13.879 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:47:48 +0000 (0:00:00.043) 0:01:13.922 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.429) 0:01:14.352 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.461) 0:01:14.813 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.052) 0:01:14.866 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [debug] 
******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.039) 0:01:14.906 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.038) 0:01:14.944 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.047) 0:01:14.992 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.040) 0:01:15.032 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.038) 0:01:15.071 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.039) 0:01:15.110 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.041) 0:01:15.152 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:47:49 +0000 (0:00:00.037) 0:01:15.189 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.055) 0:01:15.245 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.039736", "end": "2022-07-21 18:47:50.671781", "rc": 0, "start": "2022-07-21 18:47:50.632045" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=Vwi-aotz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.474) 0:01:15.719 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] 
****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.053) 0:01:15.773 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.055) 0:01:15.829 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.041) 0:01:15.871 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.039) 0:01:15.910 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.040) 0:01:15.951 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.081) 0:01:16.032 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.035) 0:01:16.068 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.022) 0:01:16.090 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Change thinlv fs type] *************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:58 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.039) 0:01:16.130 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:47:50 +0000 (0:00:00.060) 0:01:16.191 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.046) 0:01:16.238 ********* ok: [/cache/fedora-36.qcow2.snap] TASK 
[fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.621) 0:01:16.859 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.066) 0:01:16.925 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.036) 0:01:16.961 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.040) 0:01:17.002 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.063) 0:01:17.065 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:47:51 +0000 (0:00:00.024) 0:01:17.089 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] 
****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:47:54 +0000 (0:00:02.164) 0:01:19.254 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "fs_type": "xfs", "name": "lv1", "thin": true, "thin_pool_name": "tpool1" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:47:54 +0000 (0:00:00.055) 0:01:19.309 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:47:54 +0000 (0:00:00.088) 0:01:19.398 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2", "xfsprogs" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:47:57 +0000 (0:00:03.208) 0:01:22.606 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:47:57 +0000 (0:00:00.050) 0:01:22.657 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:47:57 +0000 (0:00:00.051) 0:01:22.709 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:47:57 +0000 (0:00:00.044) 0:01:22.753 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:47:57 +0000 (0:00:00.048) 0:01:22.802 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, 
"results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:47:59 +0000 (0:00:01.813) 0:01:24.615 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", 
"status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { 
"name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:48:01 +0000 (0:00:02.134) 0:01:26.749 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:48:01 +0000 (0:00:00.061) 0:01:26.811 ********* TASK 
[fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:48:01 +0000 (0:00:00.025) 0:01:26.836 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "lvm2", "e2fsprogs", "btrfs-progs", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:48:04 +0000 (0:00:03.309) 0:01:30.146 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:48:04 +0000 (0:00:00.039) 0:01:30.185 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:48:04 +0000 (0:00:00.023) 0:01:30.209 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "path": "/opt/test1", "state": "absent" } ], "packages": [ "lvm2", 
"e2fsprogs", "btrfs-progs", "xfsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:48:05 +0000 (0:00:00.086) 0:01:30.295 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:48:05 +0000 (0:00:00.045) 0:01:30.341 
********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:48:05 +0000 (0:00:00.086) 0:01:30.428 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'path': '/opt/test1', 'state': 'absent'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "mount_info": { "path": "/opt/test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:48:05 +0000 (0:00:00.508) 0:01:30.936 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:48:06 +0000 (0:00:00.885) 0:01:31.821 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:48:06 +0000 (0:00:00.041) 0:01:31.863 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:48:07 +0000 (0:00:00.823) 0:01:32.687 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:48:07 +0000 (0:00:00.437) 0:01:33.125 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:48:07 +0000 (0:00:00.025) 0:01:33.150 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for 
/cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:72 Thursday 21 July 2022 18:48:08 +0000 (0:00:01.042) 0:01:34.193 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:48:09 +0000 (0:00:00.048) 0:01:34.242 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 3221225472, "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:48:09 +0000 (0:00:00.052) 0:01:34.295 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
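The task that follows dumps, for every block device on the host, its fstype, label, size and uuid; those values feed the later fstab, size and segment-type assertions. The module it uses is internal to the test suite, so the snippet below is only an equivalent manual spot check, assuming lsblk is available on the managed host:

    - name: Spot-check block device facts (illustrative stand-in only)
      ansible.builtin.command:
        cmd: lsblk --noheadings -o NAME,FSTYPE,LABEL,SIZE,UUID
      register: lsblk_out
      changed_when: false          # read-only query

    - name: Show the collected table
      ansible.builtin.debug:
        var: lsblk_out.stdout_lines
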
***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:48:09 +0000 (0:00:00.042) 0:01:34.337 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "VRYiF9-IY7Q-wqcm-zFnA-pH2T-GW3L-t36A6W" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "8FU0c7-czN0-wzE1-pLCY-NCkN-yPM6-xEKnFs" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "3WNlUE-d8MU-Bh4V-INqY-gvUm-3CSU-5WTY6B" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-18-46-22-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": 
"738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:48:09 +0000 (0:00:00.423) 0:01:34.761 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003259", "end": "2022-07-21 18:48:10.184871", "rc": 0, "start": "2022-07-21 18:48:10.181612" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:48:10 +0000 (0:00:00.468) 0:01:35.230 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003691", "end": "2022-07-21 18:48:10.642020", "failed_when_result": false, "rc": 0, "start": "2022-07-21 18:48:10.638329" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:48:10 +0000 (0:00:00.462) 0:01:35.692 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 
'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:48:10 +0000 (0:00:00.062) 0:01:35.754 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:48:10 +0000 (0:00:00.038) 0:01:35.793 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:48:10 +0000 (0:00:00.051) 0:01:35.844 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:48:10 +0000 (0:00:00.059) 0:01:35.903 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:48:11 +0000 (0:00:01.275) 0:01:37.179 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.053) 0:01:37.232 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.054) 0:01:37.286 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.053) 0:01:37.340 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" 
}, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.040) 0:01:37.381 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.054) 0:01:37.435 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.027) 0:01:37.462 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.072) 0:01:37.535 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.045) 0:01:37.580 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.026) 0:01:37.607 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.026) 0:01:37.633 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.026) 0:01:37.659 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.027) 0:01:37.687 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.029) 
0:01:37.716 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.029) 0:01:37.746 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.027) 0:01:37.773 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.042) 0:01:37.816 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.047) 0:01:37.863 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.047) 0:01:37.910 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.030) 0:01:37.940 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.030) 0:01:37.971 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.072) 0:01:38.044 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.047) 0:01:38.091 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:48:12 +0000 (0:00:00.047) 0:01:38.139 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv1&&segtype=thin", "vg1" ], "delta": "0:00:00.040988", "end": "2022-07-21 18:48:13.552837", "rc": 0, "start": "2022-07-21 18:48:13.511849" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.462) 0:01:38.601 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.060) 0:01:38.662 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.058) 0:01:38.721 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.045) 0:01:38.766 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] 
**************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.049) 0:01:38.816 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.053) 0:01:38.870 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.033) 0:01:38.903 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.055) 0:01:38.959 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.050) 0:01:39.010 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.050) 0:01:39.060 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.038) 0:01:39.099 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.041) 0:01:39.141 ********* skipping: [/cache/fedora-36.qcow2.snap] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:48:13 +0000 (0:00:00.039) 0:01:39.180 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.037) 0:01:39.218 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.052) 0:01:39.270 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.051) 0:01:39.321 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.038) 0:01:39.360 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.041) 0:01:39.402 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.038) 0:01:39.440 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.037) 0:01:39.478 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.051) 0:01:39.529 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.090) 0:01:39.620 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.040) 0:01:39.661 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.076) 0:01:39.737 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.039) 0:01:39.777 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.038) 0:01:39.815 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.035) 0:01:39.851 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.050) 0:01:39.901 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.052) 0:01:39.953 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.026) 
0:01:39.980 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.027) 0:01:40.007 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.026) 0:01:40.033 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.026) 0:01:40.060 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.026) 0:01:40.086 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.031) 0:01:40.117 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.028) 0:01:40.145 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:48:14 +0000 (0:00:00.038) 0:01:40.184 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.040) 0:01:40.225 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': None, 'name': 'lv1', 'raid_level': None, 'size': 3221225472, 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 
'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1', '_kernel_device': '/dev/dm-4', '_raw_kernel_device': '/dev/dm-4'}) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.047) 0:01:40.272 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.055) 0:01:40.327 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.082) 0:01:40.410 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.044) 0:01:40.454 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.061) 0:01:40.516 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.027) 0:01:40.543 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] 
************************************************ task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.051) 0:01:40.595 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.041) 0:01:40.637 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.027) 0:01:40.664 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.028) 0:01:40.692 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.027) 0:01:40.719 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.038) 0:01:40.758 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.069) 0:01:40.827 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.110) 0:01:40.938 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.141) 0:01:41.080 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.039) 0:01:41.119 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.037) 0:01:41.157 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:48:15 +0000 (0:00:00.043) 0:01:41.201 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.040) 0:01:41.241 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658429226.489015, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658429224.325015, "dev": 5, "device_type": 64772, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1173, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658429224.325015, "nlink": 1, "path": "/dev/mapper/vg1-lv1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.408) 0:01:41.650 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.038) 0:01:41.689 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.039) 0:01:41.728 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.041) 0:01:41.769 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
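The device-node verification above reduces to a stat of /dev/mapper/vg1-lv1 plus assertions that the node exists and is a block device; the exact assertion expressions are not expanded in this log, so the following is only an equivalent sketch using the same device path:

    - name: Stat the logical volume's device node
      ansible.builtin.stat:
        path: /dev/mapper/vg1-lv1
        follow: true
      register: lv_node

    - name: Fail unless the node exists and is a block device
      ansible.builtin.assert:
        that:
          - lv_node.stat.exists
          - lv_node.stat.isblk
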
***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.026) 0:01:41.796 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.041) 0:01:41.837 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:48:16 +0000 (0:00:00.025) 0:01:41.863 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:48:18 +0000 (0:00:02.206) 0:01:44.069 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:48:18 +0000 (0:00:00.027) 0:01:44.096 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:48:18 +0000 (0:00:00.026) 0:01:44.123 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:48:18 +0000 (0:00:00.056) 0:01:44.180 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:48:18 +0000 (0:00:00.026) 0:01:44.206 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.026) 0:01:44.233 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.026) 0:01:44.259 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.028) 
0:01:44.287 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.026) 0:01:44.314 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.091) 0:01:44.405 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.054) 0:01:44.459 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.039) 0:01:44.499 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.038) 0:01:44.538 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.043) 0:01:44.581 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.036) 0:01:44.617 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.038) 0:01:44.656 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.074) 0:01:44.731 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:21 Thursday 21 
July 2022 18:48:19 +0000 (0:00:00.040) 0:01:44.771 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.040) 0:01:44.811 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.040) 0:01:44.852 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.040) 0:01:44.892 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:48:19 +0000 (0:00:00.039) 0:01:44.932 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.426) 0:01:45.358 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 3221225472, "changed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.430) 0:01:45.789 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "3221225472" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.055) 0:01:45.844 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.039) 0:01:45.884 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.042) 0:01:45.927 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.047) 0:01:45.974 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.040) 0:01:46.015 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.041) 0:01:46.056 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.046) 0:01:46.103 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 3221225472, "changed": false, "failed": false, "lvm": "3g", "parted": "3GiB", "size": "3 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.041) 0:01:46.144 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "3221225472" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:48:20 +0000 (0:00:00.040) 0:01:46.185 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.058) 0:01:46.243 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv1" ], "delta": "0:00:00.049726", "end": "2022-07-21 18:48:21.667748", "rc": 0, "start": "2022-07-21 18:48:21.618022" } STDOUT: LVM2_LV_NAME=lv1 LVM2_LV_ATTR=Vwi-a-tz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.470) 0:01:46.714 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.054) 0:01:46.769 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.054) 0:01:46.823 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.041) 0:01:46.865 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.042) 0:01:46.907 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.041) 0:01:46.948 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.041) 0:01:46.989 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.039) 0:01:47.029 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.024) 0:01:47.053 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create new LV under existing thinpool] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:74 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.081) 0:01:47.134 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:48:21 +0000 (0:00:00.064) 0:01:47.199 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.085) 0:01:47.284 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.559) 0:01:47.844 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": 
"item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.066) 0:01:47.910 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.038) 0:01:47.949 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.037) 0:01:47.986 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.059) 0:01:48.046 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:48:22 +0000 (0:00:00.024) 0:01:48.070 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:48:24 +0000 (0:00:01.940) 0:01:50.011 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test2", "name": "lv2", "size": "4g", "thin": true, "thin_pool_name": "tpool1" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:48:24 +0000 (0:00:00.040) 0:01:50.051 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:48:24 +0000 (0:00:00.040) 0:01:50.092 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:48:27 +0000 (0:00:03.053) 0:01:53.146 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:48:27 +0000 (0:00:00.049) 0:01:53.195 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:48:28 +0000 (0:00:00.046) 0:01:53.242 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:48:28 +0000 (0:00:00.040) 0:01:53.282 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:48:28 +0000 (0:00:00.046) 0:01:53.328 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:48:30 +0000 (0:00:02.042) 0:01:55.371 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", 
"status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": 
"initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { 
"name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { 
"name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" 
}, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:48:32 +0000 (0:00:02.203) 0:01:57.575 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:48:32 +0000 (0:00:00.059) 0:01:57.634 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:48:32 +0000 (0:00:00.031) 0:01:57.666 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", 
"xfsprogs", "btrfs-progs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:48:36 +0000 (0:00:03.589) 0:02:01.255 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:48:36 +0000 (0:00:00.039) 0:02:01.294 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:48:36 +0000 (0:00:00.022) 0:02:01.317 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "create device", "device": "/dev/mapper/vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0", "/dev/mapper/vg1-lv2" ], "mounts": [ { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" } ], "packages": [ "e2fsprogs", "lvm2", "xfsprogs", "btrfs-progs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", 
"volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:48:36 +0000 (0:00:00.040) 0:02:01.358 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:48:36 +0000 (0:00:00.038) 0:02:01.396 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:48:36 +0000 (0:00:00.039) 0:02:01.436 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** 
task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:48:36 +0000 (0:00:00.043) 0:02:01.480 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:48:37 +0000 (0:00:00.790) 0:02:02.270 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv2', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "opts": "defaults", "passno": 0, "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "mounted" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:48:37 +0000 (0:00:00.501) 0:02:02.771 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:48:38 +0000 (0:00:00.790) 0:02:03.562 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:48:38 +0000 (0:00:00.472) 0:02:04.035 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:48:38 +0000 (0:00:00.023) 0:02:04.059 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:89 Thursday 21 July 2022 18:48:39 +0000 (0:00:01.058) 0:02:05.118 
********* included: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:48:39 +0000 (0:00:00.089) 0:02:05.207 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "4g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:48:40 +0000 (0:00:00.062) 0:02:05.269 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:48:40 +0000 (0:00:00.038) 0:02:05.307 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" }, "/dev/mapper/vg1-lv2": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv2", "size": "4G", "type": "lvm", "uuid": "5d964f84-346c-4a51-9732-5731e7db2812" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "VRYiF9-IY7Q-wqcm-zFnA-pH2T-GW3L-t36A6W" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "8FU0c7-czN0-wzE1-pLCY-NCkN-yPM6-xEKnFs" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "3WNlUE-d8MU-Bh4V-INqY-gvUm-3CSU-5WTY6B" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-18-46-22-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", 
"type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:48:40 +0000 (0:00:00.449) 0:02:05.757 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004374", "end": "2022-07-21 18:48:42.121953", "rc": 0, "start": "2022-07-21 18:48:41.117579" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/mapper/vg1-lv2 /opt/test2 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:48:41 +0000 (0:00:01.412) 0:02:07.169 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.004383", "end": "2022-07-21 18:48:43.528251", "failed_when_result": false, "rc": 0, "start": "2022-07-21 18:48:42.523868" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:48:43 +0000 (0:00:01.406) 0:02:08.576 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 
'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:48:43 +0000 (0:00:00.061) 0:02:08.637 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:48:43 +0000 (0:00:00.034) 0:02:08.672 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:48:43 +0000 (0:00:00.048) 0:02:08.721 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:48:43 +0000 (0:00:00.085) 0:02:08.806 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:48:44 +0000 (0:00:01.201) 0:02:10.008 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:48:44 +0000 (0:00:00.050) 0:02:10.058 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:48:44 +0000 (0:00:00.091) 0:02:10.149 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.065) 0:02:10.215 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.080) 0:02:10.295 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.092) 0:02:10.388 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.026) 0:02:10.414 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.210) 0:02:10.625 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.047) 0:02:10.672 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.030) 0:02:10.703 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.028) 0:02:10.731 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.026) 0:02:10.758 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.025) 0:02:10.783 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.026) 0:02:10.809 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.026) 0:02:10.836 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.028) 0:02:10.864 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.035) 0:02:10.899 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.048) 0:02:10.948 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.046) 0:02:10.995 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.031) 0:02:11.026 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] 
**************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.030) 0:02:11.057 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.030) 0:02:11.087 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.046) 0:02:11.134 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:48:45 +0000 (0:00:00.046) 0:02:11.181 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheading", "-o", "pool_lv", "--select", "lv_name=lv2&&segtype=thin", "vg1" ], "delta": "0:00:00.045687", "end": "2022-07-21 18:48:46.617963", "rc": 0, "start": "2022-07-21 18:48:46.572276" } STDOUT: tpool1 TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.490) 0:02:11.671 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.056) 0:02:11.728 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.053) 0:02:11.782 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_lvmraid_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.041) 0:02:11.824 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.049) 0:02:11.873 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.086) 0:02:11.959 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.038) 0:02:11.997 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.055) 0:02:12.053 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.050) 0:02:12.104 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.051) 0:02:12.155 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:48:46 +0000 (0:00:00.036) 0:02:12.192 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key 
file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.038) 0:02:12.230 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.037) 0:02:12.267 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.036) 0:02:12.304 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.051) 0:02:12.355 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.049) 0:02:12.405 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.037) 0:02:12.442 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.038) 0:02:12.480 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.038) 0:02:12.519 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.036) 0:02:12.555 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.051) 0:02:12.607 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 
18:48:47 +0000 (0:00:00.053) 0:02:12.661 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.038) 0:02:12.699 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.038) 0:02:12.738 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.037) 0:02:12.776 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.036) 0:02:12.812 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.038) 0:02:12.851 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.050) 0:02:12.901 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.047) 0:02:12.949 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.025) 0:02:12.974 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.025) 0:02:13.000 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.027) 0:02:13.028 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.026) 0:02:13.055 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.025) 0:02:13.081 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.064) 0:02:13.145 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.027) 0:02:13.173 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:48:47 +0000 (0:00:00.037) 0:02:13.211 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.042) 0:02:13.254 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': '4g', 'state': 'present', 
'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2', '_kernel_device': '/dev/dm-5', '_raw_kernel_device': '/dev/dm-5'}) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.046) 0:02:13.300 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.051) 0:02:13.351 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.081) 0:02:13.433 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.042) 0:02:13.476 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [ { "block_available": 1030395, "block_size": 4096, "block_total": 1046016, "block_used": 15621, "device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 4220497920, "size_total": 4284481536, "uuid": "5d964f84-346c-4a51-9732-5731e7db2812" } ], "storage_test_mount_expected_match_count": "1", "storage_test_mount_point_matches": [ { "block_available": 1030395, "block_size": 4096, "block_total": 1046016, "block_used": 15621, 
"device": "/dev/mapper/vg1-lv2", "fstype": "xfs", "inode_available": 2097149, "inode_total": 2097152, "inode_used": 3, "mount": "/opt/test2", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=64k,sunit=128,swidth=128,noquota", "size_available": 4220497920, "size_total": 4284481536, "uuid": "5d964f84-346c-4a51-9732-5731e7db2812" } ], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.058) 0:02:13.534 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.052) 0:02:13.586 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.052) 0:02:13.639 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [command] ***************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.051) 0:02:13.690 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.025) 0:02:13.715 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.025) 0:02:13.741 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.025) 0:02:13.766 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.035) 0:02:13.802 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/vg1-lv2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test2 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " 
/opt/test2 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.064) 0:02:13.866 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.050) 0:02:13.917 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.052) 0:02:13.969 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.040) 0:02:14.009 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.034) 0:02:14.044 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.039) 0:02:14.084 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:48:48 +0000 (0:00:00.040) 0:02:14.125 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658429317.7190151, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1658429316.122015, "dev": 5, "device_type": 64773, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1814, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1658429316.122015, "nlink": 1, "path": "/dev/mapper/vg1-lv2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.423) 0:02:14.548 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": 
false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.126) 0:02:14.675 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.039) 0:02:14.715 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.036) 0:02:14.751 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.024) 0:02:14.776 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.037) 0:02:14.814 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:48:49 +0000 (0:00:00.024) 0:02:14.838 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:48:51 +0000 (0:00:02.005) 0:02:16.843 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.027) 0:02:16.871 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.025) 0:02:16.896 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.055) 0:02:16.951 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:48:51 
+0000 (0:00:00.026) 0:02:16.978 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.025) 0:02:17.004 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.024) 0:02:17.029 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.026) 0:02:17.055 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.024) 0:02:17.080 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:48:51 +0000 (0:00:00.088) 0:02:17.169 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.052) 0:02:17.221 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.037) 0:02:17.259 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.037) 0:02:17.296 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.039) 0:02:17.336 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.069) 0:02:17.405 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.039) 0:02:17.445 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.071) 0:02:17.517 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.037) 0:02:17.554 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.039) 0:02:17.594 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.037) 0:02:17.632 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.036) 0:02:17.668 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.039) 0:02:17.708 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:48:52 +0000 (0:00:00.401) 0:02:18.109 ********* ok: [/cache/fedora-36.qcow2.snap] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.459) 0:02:18.569 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.051) 0:02:18.620 ********* ok: 
[/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.036) 0:02:18.657 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.041) 0:02:18.698 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.038) 0:02:18.737 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.037) 0:02:18.775 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.041) 0:02:18.816 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.037) 0:02:18.854 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.038) 0:02:18.892 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.040) 0:02:18.932 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:48:53 +0000 (0:00:00.058) 0:02:18.991 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "vg1/lv2" ], "delta": "0:00:00.043268", "end": "2022-07-21 18:48:54.396856", "rc": 0, "start": "2022-07-21 18:48:54.353588" } STDOUT: LVM2_LV_NAME=lv2 LVM2_LV_ATTR=Vwi-aotz-- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=thin TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.452) 0:02:19.443 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { 
"storage_test_lv_segtype": [ "thin" ] }, "changed": false } TASK [check segment type] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.086) 0:02:19.530 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.057) 0:02:19.587 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.040) 0:02:19.627 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.038) 0:02:19.666 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.040) 0:02:19.707 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.038) 0:02:19.745 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.074) 0:02:19.820 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.053) 0:02:19.873 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove existing LV under existing thinpool] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:91 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.034) 0:02:19.907 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:48:54 +0000 (0:00:00.067) 0:02:19.974 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:48:54 +0000 
(0:00:00.039) 0:02:20.014 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:48:55 +0000 (0:00:00.558) 0:02:20.572 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:48:55 +0000 (0:00:00.066) 0:02:20.638 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:48:55 +0000 (0:00:00.037) 0:02:20.676 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:48:55 +0000 (0:00:00.035) 0:02:20.711 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:48:55 +0000 (0:00:00.058) 0:02:20.770 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:48:55 +0000 (0:00:00.021) 0:02:20.791 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: 
Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:48:57 +0000 (0:00:01.872) 0:02:22.664 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "type": "lvm", "volumes": [ { "mount_point": "/opt/test2", "name": "lv2", "state": "absent", "thin": true, "thin_pool_name": "tpool1" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:48:57 +0000 (0:00:00.075) 0:02:22.740 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:48:57 +0000 (0:00:00.074) 0:02:22.815 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:49:01 +0000 (0:00:03.453) 0:02:26.268 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:49:01 +0000 (0:00:00.052) 0:02:26.321 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:49:01 +0000 (0:00:00.049) 0:02:26.371 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:49:01 +0000 (0:00:00.041) 0:02:26.412 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:49:01 +0000 (0:00:00.053) 
0:02:26.465 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:49:03 +0000 (0:00:02.127) 0:02:28.592 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": 
"fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": 
"selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": "systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": 
"systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:49:05 +0000 (0:00:02.135) 0:02:30.728 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 
July 2022 18:49:05 +0000 (0:00:00.064) 0:02:30.793 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:49:05 +0000 (0:00:00.029) 0:02:30.823 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", "fs_type": null } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" } ], "packages": [ "lvm2", "xfsprogs", "btrfs-progs", "e2fsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:49:09 +0000 (0:00:04.094) 0:02:34.917 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:49:09 +0000 (0:00:00.081) 0:02:34.999 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:49:09 +0000 (0:00:00.026) 0:02:35.025 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv2", 
"fs_type": null } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/mapper/vg1-lv1", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" } ], "packages": [ "lvm2", "xfsprogs", "btrfs-progs", "e2fsprogs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:49:09 +0000 (0:00:00.134) 0:02:35.160 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:49:09 +0000 (0:00:00.042) 0:02:35.202 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:49:10 +0000 (0:00:00.041) 0:02:35.244 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [/cache/fedora-36.qcow2.snap] => (item={'src': '/dev/mapper/vg1-lv2', 'path': '/opt/test2', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test2", "src": "/dev/mapper/vg1-lv2", "state": "absent" }, "name": "/opt/test2", "opts": "defaults", "passno": "0", "src": "/dev/mapper/vg1-lv2" } TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:49:10 +0000 (0:00:00.450) 0:02:35.695 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:49:11 +0000 (0:00:00.816) 0:02:36.511 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:49:11 +0000 (0:00:00.039) 0:02:36.551 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:49:12 +0000 (0:00:00.791) 0:02:37.342 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:49:12 +0000 (0:00:00.437) 0:02:37.780 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:49:12 +0000 (0:00:00.026) 0:02:37.806 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:106 Thursday 21 July 2022 18:49:13 +0000 (0:00:01.053) 0:02:38.859 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:49:13 +0000 (0:00:00.054) 0:02:38.914 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv2", "_mount_id": "/dev/mapper/vg1-lv2", "_raw_device": "/dev/mapper/vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test2", "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": 4294967296, "state": "absent", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:49:13 +0000 (0:00:00.099) 0:02:39.014 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:49:13 +0000 (0:00:00.045) 0:02:39.059 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/mapper/vg1-lv1": { "fstype": "xfs", "label": "", "name": "/dev/mapper/vg1-lv1", "size": "3G", "type": "lvm", "uuid": "0f1b815b-720d-4a20-abad-011c792b19da" }, "/dev/mapper/vg1-tpool1": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1-tpool": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1-tpool", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tdata": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tdata", "size": "10G", "type": "lvm", "uuid": "" }, "/dev/mapper/vg1-tpool1_tmeta": { "fstype": "", "label": "", "name": "/dev/mapper/vg1-tpool1_tmeta", "size": "12M", "type": "lvm", "uuid": "" }, "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "VRYiF9-IY7Q-wqcm-zFnA-pH2T-GW3L-t36A6W" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdb1", "size": "10G", "type": "partition", "uuid": "8FU0c7-czN0-wzE1-pLCY-NCkN-yPM6-xEKnFs" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc1": { "fstype": "LVM2_member", "label": "", "name": "/dev/sdc1", "size": "10G", "type": "partition", "uuid": "3WNlUE-d8MU-Bh4V-INqY-gvUm-3CSU-5WTY6B" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-18-46-22-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { "fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": 
"738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:49:14 +0000 (0:00:00.427) 0:02:39.487 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004470", "end": "2022-07-21 18:49:15.922665", "rc": 0, "start": "2022-07-21 18:49:14.918195" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:49:15 +0000 (0:00:01.484) 0:02:40.971 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003378", "end": "2022-07-21 18:49:16.344108", "failed_when_result": false, "rc": 0, "start": "2022-07-21 18:49:16.340730" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:49:16 +0000 (0:00:00.420) 0:02:41.391 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:49:16 +0000 (0:00:00.063) 0:02:41.455 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:49:16 +0000 (0:00:00.037) 0:02:41.493 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:49:16 +0000 (0:00:00.048) 0:02:41.541 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "3", "_storage_test_pool_pvs_lvm": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:49:16 +0000 (0:00:00.058) 0:02:41.600 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda1", "pv": "/dev/sda1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdb1", "pv": "/dev/sdb1" } ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sdc1", "pv": "/dev/sdc1" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:49:17 +0000 (0:00:01.186) 0:02:42.786 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "3" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:49:17 +0000 (0:00:00.049) 0:02:42.835 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda1", "/dev/sdb1", "/dev/sdc1" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:49:17 +0000 (0:00:00.084) 0:02:42.920 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:49:17 +0000 (0:00:00.051) 0:02:42.971 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] 
**************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:49:17 +0000 (0:00:00.040) 0:02:43.012 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:49:17 +0000 (0:00:00.084) 0:02:43.096 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:49:17 +0000 (0:00:00.026) 0:02:43.123 ********* ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdb1" } MSG: All assertions passed ok: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sdc1" } MSG: All assertions passed TASK [Check MD RAID] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.101) 0:02:43.225 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.044) 0:02:43.269 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.024) 0:02:43.294 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.023) 0:02:43.318 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.028) 0:02:43.346 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.059) 0:02:43.406 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.025) 0:02:43.432 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.024) 0:02:43.456 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.028) 0:02:43.485 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.033) 0:02:43.519 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.045) 0:02:43.564 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.046) 0:02:43.610 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.028) 0:02:43.639 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.027) 0:02:43.667 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] 
******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.030) 0:02:43.697 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.045) 0:02:43.742 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.043) 0:02:43.786 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.025) 0:02:43.811 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.025) 0:02:43.836 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.026) 0:02:43.862 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.030) 0:02:43.893 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.054) 0:02:43.947 ********* ok: [/cache/fedora-36.qcow2.snap] => { 
"ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.056) 0:02:44.003 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sda1) => { "_storage_test_pool_member_path": "/dev/sda1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdb1) => { "_storage_test_pool_member_path": "/dev/sdb1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=/dev/sdc1) => { "_storage_test_pool_member_path": "/dev/sdc1", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.036) 0:02:44.040 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sda1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdb1) included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-36.qcow2.snap => (item=/dev/sdc1) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.055) 0:02:44.096 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.052) 0:02:44.148 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:49:18 +0000 (0:00:00.053) 0:02:44.201 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.038) 0:02:44.239 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.039) 0:02:44.279 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:49:19 
+0000 (0:00:00.036) 0:02:44.315 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.040) 0:02:44.356 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.052) 0:02:44.409 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.052) 0:02:44.461 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.037) 0:02:44.499 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.035) 0:02:44.534 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.035) 0:02:44.569 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:1 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.122) 0:02:44.692 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:4 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.049) 0:02:44.741 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:9 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.051) 0:02:44.793 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:15 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.040) 0:02:44.834 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:21 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.036) 0:02:44.871 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-crypttab.yml:27 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.035) 0:02:44.907 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.036) 0:02:44.943 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.035) 0:02:44.978 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.049) 0:02:45.028 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', '_mount_id': '/dev/mapper/vg1-lv2'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.052) 0:02:45.080 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.024) 0:02:45.105 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:49:19 +0000 
(0:00:00.026) 0:02:45.132 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.026) 0:02:45.158 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.025) 0:02:45.184 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:49:19 +0000 (0:00:00.024) 0:02:45.208 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.026) 0:02:45.234 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.026) 0:02:45.261 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.035) 0:02:45.296 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.038) 0:02:45.334 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'lv2', 'raid_level': None, 'size': 4294967296, 'state': 'absent', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': None, 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv2', '_raw_device': '/dev/mapper/vg1-lv2', 
'_mount_id': '/dev/mapper/vg1-lv2'}) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.043) 0:02:45.378 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.053) 0:02:45.432 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.079) 0:02:45.511 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.047) 0:02:45.558 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.058) 0:02:45.617 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.026) 0:02:45.644 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.052) 0:02:45.696 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task 
path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.036) 0:02:45.732 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.023) 0:02:45.756 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.036) 0:02:45.793 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.029) 0:02:45.823 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.095) 0:02:45.918 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.141) 0:02:46.060 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.025) 0:02:46.086 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.054) 0:02:46.141 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:49:20 +0000 (0:00:00.053) 0:02:46.194 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, 
"storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.045) 0:02:46.240 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.025) 0:02:46.265 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.024) 0:02:46.290 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.410) 0:02:46.700 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.038) 0:02:46.739 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.026) 0:02:46.765 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.037) 0:02:46.803 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.026) 0:02:46.830 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.026) 0:02:46.856 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:49:21 +0000 (0:00:00.025) 0:02:46.882 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:49:23 +0000 (0:00:02.008) 0:02:48.890 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.026) 0:02:48.917 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.026) 0:02:48.943 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.023) 0:02:48.966 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.022) 0:02:48.989 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.028) 0:02:49.017 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.025) 0:02:49.043 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.025) 0:02:49.069 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.025) 0:02:49.095 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:49:23 +0000 (0:00:00.054) 0:02:49.149 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.086) 0:02:49.236 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.038) 0:02:49.274 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.039) 0:02:49.314 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.038) 0:02:49.352 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.120) 0:02:49.472 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.041) 0:02:49.514 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.039) 0:02:49.554 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.040) 0:02:49.594 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.038) 0:02:49.632 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.038) 0:02:49.671 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.040) 0:02:49.711 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.038) 0:02:49.749 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.027) 0:02:49.777 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.039) 0:02:49.816 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.041) 0:02:49.857 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.039) 0:02:49.897 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.039) 0:02:49.937 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.043) 0:02:49.981 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.038) 0:02:50.019 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.036) 0:02:50.056 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.041) 0:02:50.098 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.040) 0:02:50.138 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.039) 0:02:50.177 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:49:24 +0000 (0:00:00.029) 0:02:50.206 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.027) 0:02:50.234 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.026) 0:02:50.260 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.026) 0:02:50.286 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.026) 0:02:50.313 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.026) 0:02:50.340 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.026) 0:02:50.366 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.025) 0:02:50.392 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.038) 0:02:50.431 ********* TASK [Clean up variable namespace] 
********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.021) 0:02:50.453 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Cleanup] ***************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:108 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.036) 0:02:50.489 ********* TASK [fedora.linux_system_roles.storage : set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.082) 0:02:50.571 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 21 July 2022 18:49:25 +0000 (0:00:00.043) 0:02:50.615 ********* ok: [/cache/fedora-36.qcow2.snap] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 21 July 2022 18:49:26 +0000 (0:00:00.627) 0:02:51.243 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } ok: [/cache/fedora-36.qcow2.snap] => (item=Fedora.yml) => { "ansible_facts": { "_storage_copr_packages": [ { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" } ], "_storage_copr_support_packages": [ "dnf-plugins-core" ], "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap" ] }, "ansible_included_var_files": [ "/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml" ], "ansible_loop_var": "item", "changed": false, "item": "Fedora.yml" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } skipping: [/cache/fedora-36.qcow2.snap] => (item=Fedora_36.yml) => { "ansible_loop_var": "item", "changed": false, "item": "Fedora_36.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Thursday 21 July 2022 18:49:26 +0000 (0:00:00.065) 0:02:51.308 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Thursday 21 July 2022 18:49:26 +0000 (0:00:00.037) 0:02:51.346 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : include the appropriate provider 
tasks] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Thursday 21 July 2022 18:49:26 +0000 (0:00:00.037) 0:02:51.383 ********* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Thursday 21 July 2022 18:49:26 +0000 (0:00:00.060) 0:02:51.444 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure blivet is available] ******* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 Thursday 21 July 2022 18:49:26 +0000 (0:00:00.022) 0:02:51.466 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : show storage_pools] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13 Thursday 21 July 2022 18:49:28 +0000 (0:00:01.790) 0:02:53.257 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "vg1", "state": "absent", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "lv1", "size": "3g", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g" } ] } ] } TASK [fedora.linux_system_roles.storage : show storage_volumes] **************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18 Thursday 21 July 2022 18:49:28 +0000 (0:00:00.041) 0:02:53.298 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : get required packages] *************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 Thursday 21 July 2022 18:49:28 +0000 (0:00:00.035) 0:02:53.333 ********* ok: [/cache/fedora-36.qcow2.snap] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35 Thursday 21 July 2022 18:49:31 +0000 (0:00:03.132) 0:02:56.465 ********* included: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-36.qcow2.snap TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 21 July 2022 18:49:31 +0000 (0:00:00.050) 0:02:56.516 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" 
}, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 21 July 2022 18:49:31 +0000 (0:00:00.048) 0:02:56.565 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : enable COPRs] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18 Thursday 21 July 2022 18:49:31 +0000 (0:00:00.041) 0:02:56.607 ********* skipping: [/cache/fedora-36.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']}) => { "ansible_loop_var": "repo", "changed": false, "repo": { "packages": [ "vdo", "kmod-vdo" ], "repository": "rhawalsh/dm-vdo" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : make sure required packages are installed] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 Thursday 21 July 2022 18:49:31 +0000 (0:00:00.085) 0:02:56.692 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : get service facts] ******************* task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 Thursday 21 July 2022 18:49:33 +0000 (0:00:02.056) 0:02:58.748 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "bluetooth.service": { "name": "bluetooth.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-os-release.service": { "name": "console-login-helper-messages-gensnippet-os-release.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-login-helper-messages-gensnippet-ssh-keys.service": { "name": "console-login-helper-messages-gensnippet-ssh-keys.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.bluez.service": { "name": "dbus-org.bluez.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.oom1.service": { "name": "dbus-org.freedesktop.oom1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.resolve1.service": { "name": "dbus-org.freedesktop.resolve1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "stopped", "status": "static" }, "fwupd-offline-update.service": { "name": "fwupd-offline-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd-refresh.service": { "name": "fwupd-refresh.service", "source": "systemd", "state": "inactive", "status": "static" }, "fwupd.service": { "name": "fwupd.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:1.service": { "name": "lvm2-pvscan@8:1.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:17.service": { "name": "lvm2-pvscan@8:17.service", "source": "systemd", "state": "stopped", "status": "active" }, "lvm2-pvscan@8:33.service": { "name": "lvm2-pvscan@8:33.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "pcscd.service": { "name": "pcscd.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpmdb-migrate.service": { "name": "rpmdb-migrate.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-system-token.service": { "name": "systemd-boot-system-token.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-FAAC\\x2dBFC8.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service": { "name": "systemd-fsck@dev-disk-by\\x2duuid-cb4982f0\\x2dd861\\x2d4106\\x2dada7\\x2daaeba17ae2bb.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdb1.service": { "name": "systemd-fsck@dev-vdb1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-fsck@dev-vdc1.service": { "name": "systemd-fsck@dev-vdc1.service", "source": "systemd", "state": "stopped", "status": "active" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-homed-activate.service": { "name": 
"systemd-homed-activate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-homed.service": { "name": "systemd-homed.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-networkd.service": { "name": "systemd-networkd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-oomd.service": { "name": "systemd-oomd.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "running", "status": "enabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-time-wait-sync.service": { "name": "systemd-time-wait-sync.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-userdbd.service": { "name": "systemd-userdbd.service", "source": "systemd", "state": "running", "status": "indirect" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-zram-setup@.service": { "name": "systemd-zram-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-zram-setup@zram0.service": { "name": "systemd-zram-setup@zram0.service", "source": "systemd", "state": "stopped", "status": "active" }, "udisks2.service": { "name": "udisks2.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53 Thursday 21 July 2022 18:49:35 +0000 (0:00:02.116) 0:03:00.864 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Thursday 21 July 2022 18:49:35 +0000 (0:00:00.060) 0:03:00.925 ********* TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Thursday 21 July 2022 18:49:35 +0000 (0:00:00.022) 0:03:00.947 ********* changed: [/cache/fedora-36.qcow2.snap] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs", "btrfs-progs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78 Thursday 21 July 2022 18:49:40 +0000 (0:00:04.739) 0:03:05.687 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.039) 0:03:05.726 ********* TASK [fedora.linux_system_roles.storage : show blivet_output] ****************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.022) 0:03:05.749 ********* ok: [/cache/fedora-36.qcow2.snap] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/mapper/vg1-tpool1", "fs_type": null }, { "action": "destroy device", "device": "/dev/vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdb1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sdc1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sdc1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "disklabel" }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "lvmpv" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sr0", "/dev/vda1", "/dev/vda2", "/dev/vda3", "/dev/vda4", "/dev/vda5", "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/nvme0n1", "/dev/nvme1n1", "/dev/nvme2n1", "/dev/vdb1", "/dev/vdc1", "/dev/vdd", "/dev/vde", "/dev/vdf", "/dev/zram0" ], "mounts": [], "packages": [ "e2fsprogs", "btrfs-progs", "dosfstools" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": 
"lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.040) 0:03:05.790 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": "vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.039) 0:03:05.829 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : remove obsolete mounts] ************** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.040) 0:03:05.870 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.035) 0:03:05.905 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : set up new/current mounts] *********** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.026) 0:03:05.932 ********* TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.034) 0:03:05.967 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156 Thursday 21 July 2022 18:49:40 +0000 (0:00:00.024) 0:03:05.991 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "atime": 1658410804.3382256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1658384482.541, "dev": 31, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1658384304.669, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "11", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] *** task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Thursday 21 July 2022 18:49:41 +0000 (0:00:00.460) 0:03:06.452 ********* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 Thursday 21 July 2022 18:49:41 +0000 (0:00:00.026) 0:03:06.479 ********* ok: [/cache/fedora-36.qcow2.snap] META: role_complete for /cache/fedora-36.qcow2.snap TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/tests_create_thinp_then_remove.yml:125 Thursday 21 July 2022 18:49:42 +0000 (0:00:01.075) 0:03:07.555 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml for /cache/fedora-36.qcow2.snap TASK [Print out pool information] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:1 Thursday 21 July 2022 18:49:42 +0000 (0:00:00.057) 0:03:07.612 ********* ok: [/cache/fedora-36.qcow2.snap] => { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "name": 
"vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/vg1-lv1", "_mount_id": "/dev/mapper/vg1-lv1", "_raw_device": "/dev/mapper/vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "size": "3g", "state": "present", "thin": true, "thin_pool_name": "tpool1", "thin_pool_size": "10g", "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:6 Thursday 21 July 2022 18:49:42 +0000 (0:00:00.049) 0:03:07.661 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:14 Thursday 21 July 2022 18:49:42 +0000 (0:00:00.036) 0:03:07.698 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "info": { "/dev/nvme0n1": { "fstype": "", "label": "", "name": "/dev/nvme0n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme1n1": { "fstype": "", "label": "", "name": "/dev/nvme1n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/nvme2n1": { "fstype": "", "label": "", "name": "/dev/nvme2n1", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sr0": { "fstype": "iso9660", "label": "cidata", "name": "/dev/sr0", "size": "364K", "type": "rom", "uuid": "2022-07-21-18-46-22-00" }, "/dev/vda": { "fstype": "", "label": "", "name": "/dev/vda", "size": "4G", "type": "disk", "uuid": "" }, "/dev/vda1": { "fstype": "", "label": "", "name": "/dev/vda1", "size": "1M", "type": "partition", "uuid": "" }, "/dev/vda2": { "fstype": "ext4", "label": "boot", "name": "/dev/vda2", "size": "1000M", "type": "partition", "uuid": "cb4982f0-d861-4106-ada7-aaeba17ae2bb" }, "/dev/vda3": { "fstype": "vfat", "label": "", "name": "/dev/vda3", "size": "100M", "type": "partition", "uuid": "FAAC-BFC8" }, "/dev/vda4": { "fstype": "", "label": "", "name": "/dev/vda4", "size": "4M", "type": "partition", "uuid": "" }, "/dev/vda5": { "fstype": "btrfs", "label": "fedora", "name": "/dev/vda5", "size": "2.9G", "type": "partition", "uuid": "3e9b04e0-83ba-408b-b132-8988cb220981" }, "/dev/vdb": { "fstype": "", "label": "", "name": "/dev/vdb", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdb1": { "fstype": "ext4", "label": "yumcache", "name": "/dev/vdb1", "size": "2G", "type": "partition", "uuid": "951be07e-05cd-4e0a-a4f5-ac4b1cde40f8" }, "/dev/vdc": { 
"fstype": "", "label": "", "name": "/dev/vdc", "size": "2G", "type": "disk", "uuid": "" }, "/dev/vdc1": { "fstype": "ext4", "label": "yumvarlib", "name": "/dev/vdc1", "size": "2G", "type": "partition", "uuid": "738681e1-fb1e-40db-9d4a-ae9ebdd619b5" }, "/dev/vdd": { "fstype": "", "label": "", "name": "/dev/vdd", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vde": { "fstype": "", "label": "", "name": "/dev/vde", "size": "10G", "type": "disk", "uuid": "" }, "/dev/vdf": { "fstype": "", "label": "", "name": "/dev/vdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/zram0": { "fstype": "", "label": "", "name": "/dev/zram0", "size": "1.9G", "type": "disk", "uuid": "" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:19 Thursday 21 July 2022 18:49:42 +0000 (0:00:00.427) 0:03:08.125 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003740", "end": "2022-07-21 18:49:43.499652", "rc": 0, "start": "2022-07-21 18:49:43.495912" } STDOUT: # # /etc/fstab # Created by anaconda on Thu Jul 21 06:18:24 2022 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=3e9b04e0-83ba-408b-b132-8988cb220981 / btrfs subvol=root,compress=zstd:1 0 0 UUID=cb4982f0-d861-4106-ada7-aaeba17ae2bb /boot ext4 defaults 1 2 UUID=FAAC-BFC8 /boot/efi vfat defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2 UUID=3e9b04e0-83ba-408b-b132-8988cb220981 /home btrfs subvol=home,compress=zstd:1 0 0 /dev/vdb1 /var/cache/dnf auto defaults,nofail,comment=cloudconfig 0 2 /dev/vdc1 /var/lib/dnf auto defaults,nofail,comment=cloudconfig 0 2 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:24 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.425) 0:03:08.551 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003090", "end": "2022-07-21 18:49:43.914838", "failed_when_result": false, "rc": 0, "start": "2022-07-21 18:49:43.911748" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:33 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.411) 0:03:08.962 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml for /cache/fedora-36.qcow2.snap => (item={'disks': ['sda', 'sdb', 'sdc'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'vg1', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'absent', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 
'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}], 'raid_chunk_size': None}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:5 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.070) 0:03:09.032 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool.yml:18 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.040) 0:03:09.073 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml for /cache/fedora-36.qcow2.snap => (item=members) included: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-36.qcow2.snap => (item=volumes) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:1 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.053) 0:03:09.127 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_count": "0", "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:6 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.056) 0:03:09.184 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:15 Thursday 21 July 2022 18:49:43 +0000 (0:00:00.023) 0:03:09.207 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": "0" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:19 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.052) 0:03:09.260 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_pool_pvs": [] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:23 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.052) 0:03:09.312 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:29 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.050) 0:03:09.363 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:33 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.039) 0:03:09.402 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_pv_type": "partition" }, 
"changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:37 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.054) 0:03:09.457 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:41 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.026) 0:03:09.483 ********* TASK [Check MD RAID] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:50 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.024) 0:03:09.507 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml for /cache/fedora-36.qcow2.snap TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:6 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.087) 0:03:09.594 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:12 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.027) 0:03:09.621 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:16 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.025) 0:03:09.647 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:20 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.023) 0:03:09.671 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:24 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.025) 0:03:09.696 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:30 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.023) 0:03:09.720 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:36 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.024) 0:03:09.745 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-md.yml:44 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.026) 0:03:09.771 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_md_active_devices_re": null, 
"storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:53 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.040) 0:03:09.811 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-lvmraid.yml:1 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.045) 0:03:09.856 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [Get information about LVM RAID] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:3 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.045) 0:03:09.902 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is LVM RAID] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.028) 0:03:09.930 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-lvmraid.yml:12 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.031) 0:03:09.961 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:56 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.037) 0:03:09.999 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-thin.yml:1 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.050) 0:03:10.049 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 
'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:3 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.050) 0:03:10.099 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:8 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.028) 0:03:10.127 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:13 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.029) 0:03:10.157 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-thin.yml:17 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.025) 0:03:10.183 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check member encryption] ************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:59 Thursday 21 July 2022 18:49:44 +0000 (0:00:00.028) 0:03:10.211 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-36.qcow2.snap TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:4 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.052) 0:03:10.264 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:8 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.054) 0:03:10.318 ********* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:15 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.024) 0:03:10.343 ********* TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-encryption.yml:22 Thursday 21 July 
2022 18:49:45 +0000 (0:00:00.022) 0:03:10.365 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:62 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.035) 0:03:10.401 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-36.qcow2.snap TASK [Validate pool member VDO settings] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-members-vdo.yml:1 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.050) 0:03:10.451 ********* included: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [get information about VDO deduplication] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:3 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.045) 0:03:10.497 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:8 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.026) 0:03:10.523 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:11 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.026) 0:03:10.550 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:16 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.035) 0:03:10.586 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:21 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.025) 0:03:10.611 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] 
****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:24 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.025) 0:03:10.637 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:29 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.027) 0:03:10.665 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/verify-pool-member-vdo.yml:39 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.025) 0:03:10.690 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-members.yml:65 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.077) 0:03:10.768 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [verify the volumes] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-pool-volumes.yml:3 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.125) 0:03:10.893 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml for /cache/fedora-36.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'lv1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': 'tpool1', 'thin_pool_size': '10g', 'thin': True, 'vdo_pool_size': None, 'disks': ['sda', 'sdb', 'sdc'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/vg1-lv1', '_raw_device': '/dev/mapper/vg1-lv1', '_mount_id': '/dev/mapper/vg1-lv1'}) TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:2 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.046) 0:03:10.940 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [include_tasks] *********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:10 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.052) 0:03:10.992 ********* included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml for /cache/fedora-36.qcow2.snap => (item=mount) included: 
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-36.qcow2.snap => (item=fstab) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml for /cache/fedora-36.qcow2.snap => (item=fs) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml for /cache/fedora-36.qcow2.snap => (item=device) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-36.qcow2.snap => (item=encryption) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml for /cache/fedora-36.qcow2.snap => (item=md) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml for /cache/fedora-36.qcow2.snap => (item=size) included: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml for /cache/fedora-36.qcow2.snap => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:6 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.078) 0:03:11.071 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/vg1-lv1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:10 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.042) 0:03:11.114 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": [], "storage_test_mount_expected_match_count": "0", "storage_test_mount_point_matches": [], "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Verify the current mount state by device] ******************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:20 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.058) 0:03:11.172 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by mount point] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:29 Thursday 21 July 2022 18:49:45 +0000 (0:00:00.026) 0:03:11.199 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify the mount fs type] ************************************************ task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:37 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.051) 0:03:11.250 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [command] ***************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:46 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.036) 0:03:11.287 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:50 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.023) 0:03:11.311 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:55 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.023) 0:03:11.334 ********* 
skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-mount.yml:65 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.023) 0:03:11.358 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_mount_device_matches": null, "storage_test_mount_expected_match_count": null, "storage_test_mount_point_matches": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:2 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.033) 0:03:11.391 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:12 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.060) 0:03:11.452 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:19 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.026) 0:03:11.478 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:25 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.050) 0:03:11.528 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up variables] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fstab.yml:34 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.036) 0:03:11.565 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:4 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.038) 0:03:11.603 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-fs.yml:10 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.026) 0:03:11.630 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device 
node is present] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:4 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.025) 0:03:11.655 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "stat": { "exists": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:10 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.413) 0:03:12.069 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about this volume] ********************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:15 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.039) 0:03:12.109 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [(1/2) Process volume type (set initial value)] *************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:21 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.025) 0:03:12.135 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [(2/2) Process volume type (get RAID value)] ****************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:25 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.038) 0:03:12.173 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-device.yml:30 Thursday 21 July 2022 18:49:46 +0000 (0:00:00.033) 0:03:12.207 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:3 Thursday 21 July 2022 18:49:47 +0000 (0:00:00.030) 0:03:12.237 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 Thursday 21 July 2022 18:49:47 +0000 (0:00:00.026) 0:03:12.264 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:15 Thursday 21 July 2022 18:49:49 +0000 (0:00:02.073) 0:03:14.337 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:21 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.025) 0:03:14.362 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:27 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.024) 0:03:14.387 ********* 
skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:33 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.078) 0:03:14.465 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:39 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.025) 0:03:14.490 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:44 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.023) 0:03:14.514 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:50 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.023) 0:03:14.538 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:56 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.026) 0:03:14.565 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:62 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.025) 0:03:14.590 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:67 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.051) 0:03:14.642 ********* ok: [/cache/fedora-36.qcow2.snap] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:72 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.052) 0:03:14.695 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:78 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.039) 0:03:14.734 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:84 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.035) 0:03:14.769 ********* skipping: 
[/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:90 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.035) 0:03:14.804 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [get information about RAID] ********************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:7 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.037) 0:03:14.842 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:13 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.041) 0:03:14.884 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:17 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.038) 0:03:14.922 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:21 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.039) 0:03:14.961 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID active devices count] ***************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:25 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.038) 0:03:15.000 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID spare devices count] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:31 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.040) 0:03:15.041 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check RAID metadata version] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-md.yml:37 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.038) 0:03:15.080 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the actual size of the volume] ************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:3 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.038) 0:03:15.118 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested size of the volume] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:9 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.025) 0:03:15.144 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:15 Thursday 21 July 2022 18:49:49 +0000 (0:00:00.039) 0:03:15.184 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:20 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.035) 0:03:15.219 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:25 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.035) 0:03:15.255 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:28 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.035) 0:03:15.290 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Get the size of parent/pool device] ************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:31 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.040) 0:03:15.330 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:36 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.039) 0:03:15.370 ********* skipping: [/cache/fedora-36.qcow2.snap] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:39 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.038) 0:03:15.408 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:44 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.038) 0:03:15.447 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [debug] ******************************************************************* task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:47 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.040) 0:03:15.488 ********* ok: [/cache/fedora-36.qcow2.snap] => { "storage_test_expected_size": "4294967296" } TASK [assert] ****************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-size.yml:50 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.037) 0:03:15.525 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:6 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.024) 0:03:15.550 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:14 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.023) 0:03:15.573 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [check segment type] ****************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:17 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.024) 0:03:15.598 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:22 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.023) 0:03:15.621 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [parse the requested cache size] ****************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:26 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.035) 0:03:15.657 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [set_fact] **************************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:32 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.027) 0:03:15.685 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume-cache.yml:36 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.027) 0:03:15.712 ********* skipping: [/cache/fedora-36.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/tmppde4z2jm/tests/storage/test-verify-volume.yml:16 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.024) 0:03:15.736 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:43 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.035) 0:03:15.772 ********* TASK [Clean up variable namespace] ********************************************* task path: /tmp/tmppde4z2jm/tests/storage/verify-role-results.yml:53 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.023) 0:03:15.795 ********* ok: [/cache/fedora-36.qcow2.snap] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* /cache/fedora-36.qcow2.snap : ok=648 changed=8 unreachable=0 failed=0 skipped=476 rescued=0 ignored=0 Thursday 21 July 2022 18:49:50 +0000 (0:00:00.051) 0:03:15.847 ********* =============================================================================== fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 4.99s 
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 4.74s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 4.09s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 3.59s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
fedora.linux_system_roles.storage : get required packages --------------- 3.45s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 3.31s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 3.26s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
fedora.linux_system_roles.storage : get required packages --------------- 3.21s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
fedora.linux_system_roles.storage : get required packages --------------- 3.13s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
fedora.linux_system_roles.storage : get required packages --------------- 3.05s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
fedora.linux_system_roles.storage : get required packages --------------- 3.01s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
fedora.linux_system_roles.storage : make sure blivet is available ------- 2.60s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
fedora.linux_system_roles.storage : get service facts ------------------- 2.23s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Ensure cryptsetup is present -------------------------------------------- 2.21s
/tmp/tmppde4z2jm/tests/storage/test-verify-volume-encryption.yml:10 -----------
fedora.linux_system_roles.storage : get service facts ------------------- 2.20s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
fedora.linux_system_roles.storage : make sure blivet is available ------- 2.16s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
fedora.linux_system_roles.storage : get service facts ------------------- 2.15s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
fedora.linux_system_roles.storage : get service facts ------------------- 2.14s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
fedora.linux_system_roles.storage : get service facts ------------------- 2.13s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
fedora.linux_system_roles.storage : make sure required packages are installed --- 2.13s
/tmp/tmpdbh6f40u/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
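
For readers who want to reproduce the scenario this run verifies, the loop items recorded in the log (volume lv1 of size 3g inside thin pool tpool1 of size 10g, volume group vg1 on disks sda/sdb/sdc, xfs mounted at /opt/test1) correspond roughly to a storage role invocation like the sketch below. This is a reconstruction from the logged item values and standard role parameters (storage_pools, state), not the literal contents of tests_create_thinp_then_remove.yml; the real test may set additional variables and may reference the role by a different name.

# Hypothetical minimal play reconstructed from the logged pool/volume items.
# Values (vg1, lv1, tpool1, sda/sdb/sdc, 3g, 10g, xfs, /opt/test1) come from the
# log above; everything else is an assumption about how the test drives the role.
- hosts: all
  become: true
  tasks:
    - name: Create a thin-provisioned volume in a thin pool
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: vg1
            disks: ['sda', 'sdb', 'sdc']
            volumes:
              - name: lv1
                thin: true
                thin_pool_name: tpool1
                thin_pool_size: 10g
                size: 3g
                fs_type: xfs
                mount_point: /opt/test1

    - name: Remove the thin volume and pool again
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: vg1
            disks: ['sda', 'sdb', 'sdc']
            state: absent
            volumes:
              - name: lv1
                thin: true
                thin_pool_name: tpool1
                mount_point: /opt/test1
                state: absent

The verification pass captured above runs after the removal step, which is why the log shows "_storage_test_volume_present": false and why the device-node stat reports "exists": false for /dev/mapper/vg1-lv1.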